[Binary archive content removed: ustar tar archive `zuul-output` containing the directory `var/home/core/zuul-output/logs/` and the gzip-compressed file `var/home/core/zuul-output/logs/kubelet.log.gz`. The compressed log bytes are not recoverable as text.]
A/|Jn>4b/GVK`L7rEս% SH@cN";-$&&$Z5uJUq}=߃.L J)6"y&zGņwú R|oswy٥(M}zm^]{s3zMlh?جf BfUI7a\lxzXsA*E?F7މ\܃~}P}IS&@UJ>E?74*;tS%k7xdr>9HϭxӫXB#f[zVQɼpMȷ= o>r_Bνi g.\_8JYL9eQ6C!$[f0Rx&*p-(ݵ'mA۳zx{8,L>Sd2&31BJ ɤ^u]͇jJŲdB-XXDaz!xO,R./<Yd鐭>/y|[gXb)g?^ΑX|'O8 X/D:m!jG_=0w UBl *X0B[Uvi>X321Aӥ\^p)|0҂EFh kO ™U4*C$hHPɑCFO-}9#?ۘTPM[1(H,*tZ`0W1dmĞ1RWqc=7 ց/w^Mim~n9\ǫݹإfk!TB&>CS^Afs"Z%ŷN{u Fී~mi_,g"_T)m8_^0%Z Ƅ`N G^Fo~p̭!Rmy]4?n_bz򎉽 &ճl:9],}뫸-@ߵezПcj<6+&Խo^K_2KA&8 w $ӽF-|w36*RAM?hS4> ~ݠ{FnsT b%dQ,edK1`68C7e}U{Aye$W }lCc3_x |՜ói M멤.Goo)GF5OGUhg~]wD3KCV4z˰95>Mn P#Pg~S|h|E溴g+ }NIL$`,օ"`THep80vݢ^]]"E# xlh VLh}҃t/'L]!Go_AX?7N~oO;Z݀.ܝ߻ﯣtXg99%ZT*e33+ KRcAg IM `]QQ8 XI(vaaR"2~dEûiY81F'^Z:N_TR9_ d>ZŸ|'Gm7 ,3A~olq3M>=/ffn~=یj@a'J (eU420DMlA.P!ȼ$=R:"* Xq90ha iU30AbFA3 ('@ƖAZl axN˄d`(QHL=8{11)DXҠs0IRY8T("؄NcF1Zx eF͝QFGRGT KIw`v (Ǒㆰg'_C>Euٯ"X_%$cD,V.V`P0B[A()@qt^$Sj84!su{w-ĩCc%S93YGJH}Jw`uz6$p8; ZN8}(z'$FN.'"UR5+ium_]-Q{;06}j*cd?GqxBç01NJ:=5H5&kL GOc|5Ga k{&JE-w߼9^r-" `BK 1qbOE\%h; J*q*ŕ`뭚@i9܌ח5qn܍[-.e [hH33L² f.0E1ׄj*jW#H+'ߘ'¨Ȣsña b CH& 4H *Ҟx BE:TJFqNT`mZ7r!AP(ZuBɥK)42V[x[+VX) d P108pJrbԘuh-cx-6"+ f";Pi>ݽrBDq:oe`QAVj4BB R!QPmLI.0&ke 99q@jc &"*,ѪJCNPa(phSYAK/mThʱR\uS@y;3b/ A<V>lQrΔtp>uӼĽ]qcRvӬ?.ģat4Fo9uL=nf9!,`dS8`ݓvu[#Cװy:,kCZ,g=yu/{Zuv"f6<1Ld޵6"iۼ_h"lv0H/W[cd%ߢtdK$[|*~U,~e߾UPuOT4HH)dAJ4cbD-Me2j$a耥ڄM a1qbU8R<Nˋ S˔v*tPZRK, BZmKظMMjN-{[%u2S:`DG*\30#{Y>淸7Yl]Ӟ{:k@M=԰hRZhT[<= ڄ픷ybJMIS+/=}WGzJsu:m,jPԨ8Oy N F߁^v3B9-!,W :(cęoO8uEfx,v >7R8 ea۪N_rjT{%yJo8̃iRgηWsÐ@|6ɬp5_M|Ɏ`(pN+.PrEaM\"8nҲ=,wx7)8H&1t`u#<ѳɢ )Od'_}&_d f"XA½U4] oo;]LiX0K{ .MlCc۟&P3}oތ^ngV̰?ߏ~3u6uq\ᓨ#aYo0BF/+B׏Wmwh= !{^ V@xP"p@F '܊zb2Ŧs)Ä'tb:rGFFZĶc[gV֗wr<^s[ڥzGFϾ{?Γml%mro$Z3-vYGNǎ#;(v({vh{v04UVJ"-0½6\yV 纶1a=36p<$Nd {ZDSURaJ틉9ll.Kk;b|VGrG_Vs$wDq 2 @M`A4e+\ ˜վBRZYJOԊH,XY:I툊8EF C,nIc ҙOA`т~ü%Qٚ8[*~VdZ^ 11gBafMoP_ILˆ1F"cCB9Φ)"ϋ8u a=m!L^oxf[mEZGBZRS^ʫq+bvi,52HL+*تe[[Ts . 
0?eV9.9XMlĈ}.N480D*r lppb!.xpv8?:n?d#zC؎78NK8*^a6I%BG̥) ㈱:W *om%_7 orlNQX@ |[7bQgtR|NꁸRI=V r'@Ji=w1'Ė^|3}ůXards*q5;zc~s*) We+̕꭯P9#sF\jb@ZJɩ+jޣ\Q"1W .s1W Hv5W\1a:#s6烮s1WYZv*KE5W\qSN\1WY\\S7W "|J0\-j@`٘+1!\UvĻ*y5W\IN%=PL ~6Jr9sR| H$+E/)zE~ ;۟mJJŸM]^}"j)EGi"Uo(36S'rvlVU?7!cF ^\p-b֓dqOjF [󤤁LlThJ[ p<ÖyG2/ H# È++V#-i$-,NNHCSTRkTbϽ֓H3]%F X iMɤAB N! 2gjxLh T&BS3,X[)$,& $H4t֣ gc#nO:977Ű>om [jy# 8qPlČמ0M ƈ)vhXƭFxI.:EKTxr-٬+qwU+ѬDSL5~E,Yyj!=Vzb-X !BbBLT9 +PltԕNp k}77 .R'jbR% g8 33}& EA>̣}f^w݀{.oͺM^׭2o~T1Ő"Qw2ȕsp Jgw-H{6|38@\2JrK*7(lIE=桪\)$#MF:Sa1qM

vdƘghSqtavtw=zl3~ll kBmY*K_KNyb:_[yd#XT:$sRxЂZu\XVSTQ(Zͽ'sb^E]Qm1QJ2ǟ%?h(CьK r28 .bthϮ-w^bI3g}|/#Z PӳFG!gIY0 1K2L \smqQy[BrhI)Cuy`} ;~'EnO%̦ pgѸ^!A3tBKIL(J>&5֜-6*|$NTBÒTz 4P&Q2HbWP uT֖а7˪cU>(-ZG)85Ii)+ ΃**y3y6=s^б-UӪUM;ZN{qг]Q5B0RP-{f&&VkԢf0՞S436pu@e0i9 \&0WR;T+Bq9%},p'+8GpBq \eivR.jWBs27zzqNz;yd0|;frXh ;꽤{%?eR.Ӿ .h/QGH^H܅uw+D̿ 7mɞ<}Fl7G_{pOU+-bO_٫`Q[vNjSਙՙ-1ݗY)>D*R#<1Π:|jΜp'GMX_ŕ/MV_8K -@#B0,UԱUrWT@(d,`},-\=\j_BX3 \eqyoR\;\e)UeWO33~F7_vqIQ8[/`+ N%ѯ1_ceJjeHr/K1oeDd Ž Di8j Y(ܽB?xו0k(nFr,DB?1qƋַ+ZHy0K\c IS8ajcs箳Vg/8`*E"`DYpT<)dE4pHN!hq.@N'mLT%3⩣x%3Zl뎛7zq Gu $_=Q`7VwYY?SRVgъpr;?PG&)` d6,P4&yj~2R$HR452<Fs8OlP|]Uъݬ G"h6"wT֚4y=yןZ ipYT%"E RPt)91\*hR}6G9T&egq"FXkpUC1g%1Ւ:F`k2 tko|zٛKP{g<n.Ikp|WL0ǃRd1q .DxІ eF1+X#@'`m"Xuܻ-y^EݕioܶvVvk{cޜ?wחPf ˪TJZ$|BD Bf Zs&s*S2GR9KVPY |Uun/vgyڽ66S-l %_㥆?G*1zy=z9U37hfU̪S5˺jbU8R:NY˫Ӗ%Ub[iPmi?@Bm :6z~LF!N+]ʭ+m6uG(z82$K%]BGtrڹnG;$jݽcᤣ6{xhUl6m~ [1I -ogQ0?E#C%[ǐ-zG+ Vz$ $!ק j'X652;QL <Y="&GW`8T_㌶s6v {S &ڥ+ Jy"Q#p$m@GK%D"% BHҺ@Qޖ}ݝmf/79_lѧ|+,vżo}s!FKE[c6ıU!HE#T"c& mmU Yi42Or*7*KHe=UZ$:d!@(PC#"d+w[ "RL-jC!NiTa~XL=3^ͱ#$XNϳX4eǙ=>w_vNϭ|Gwf ^}fnn]]>{dٚ;EW56 0b2}epq-n Cj|ssl5oz~m\.祖x4{cͣEֽWԳ`p;e䥷ΆE\W/Ynf"fyUSS0lUB9T= ldoPBx_eimYJjնOX K%OJ(~qX=]QC^F} #S#zsl!瓅ݗ=!][F^o %g|%;.=&;>}-$%-'?~3Ry'ƒ ת G:h=S.zцU(pYSy8͸*'ӚK<(&Nƻ,BrkHQl=FÛrIQT ed-Y, Ș%K&8- t !94MB┡E8l :EJaTid,&N72f)I)Tje Ea, O Tl=l 1y⧃7h`09L?8b+kUP+C{IXQsQ9Qܣ d1P(:$pg#9YMWHD#`(QDEKӍn4 K嵠v^VQ[VԮ T%$h"i \ϣFx"!|fAݨ-zlKXձU3^iPST i˼R"}mTBKs郕Z) ȝ "D锨P%9ISd~>@rTlz%ǭl3ެ|~-Ia[fx{] ڳ+jj MCaayj"8<4B[٘h"&Դrz}&aU*gT-cH"^x'l$JLɤNBF!J h̶"&KE L%ʌbV0)<k1Yz_5=ծȊ6Z~=|;շ..gзb,r;C^~5vm7Lp>Λ\fE)rsr%T8 y Қ)RRA(ŜRe@#9&" <[*vp{ɀt[j$тI°"?`K ?oyg5}&E.G.k]$sd,y:\󬋄n V'P|0ma+IGܰo iAd&kC%z8|QMFC-wNZn-^4&ڑۺ>x'_'4A|qhnNwÆ99ޛ.`gD|IJ]ElXxG5Xd0Ʒ)+e2c5Hq`xXOh Vu"":)k_5_ ݆M@ oI {fSf##E D2e0ZE`d!䑣J|;GS<'&]b͇駛3ᷪMU-7@ g4f8>HxSWR"{uY'دy )auI?MF v`WaZ}`RGTQJmŌ6>VaLg>#-FH߫ +vCvz|sc#U[5iiaP;D0iG+%*(^`;"Z̎=r6o"g`fP VS ?}  S:O]?YC J)B<΀?FZ&2BMPXTJ@άQQƵ%%:q<:6h(xƣEv A ?]Xm4 N"(Qj%\{P@H1:hyT7Ν oSr:_2h.^zދl%sp [4fK9کj#U EDD%ZU^1!8[rRBTV|(C!x$ ) 
d+k5f,`Hkzl"VHKDf|df9/N_?a'v,ojص-x-&l0Q_._h3aùmznO=shl\kD$mv0QWݘӀ?%08[it5Ok'[WƵ]v>np{nWFu|oσGs>yk.ur,Xsͣ.0mm{)˗R)δCݿvs+|sG͍fP"طjIL$`݊JEb+E*\|T\7en~s.Q}k`AL)2Fy[z"/tYGy}ز?o/VgXg99%^T*e33+ KR$$ IM ]QQ3O>0cAn$*Ű[)ѬZa6rE4\+z`Bx=xv7کYrsjMpf" $=It`QQ;P GTRi`u&fJF0G [p).}Ԗx93Tn͘f͸{.W̺]x"mˠSOnɫ}2>:qdTf03hmpc$F FK&\`yPZZ @@ۯS`gHZϹ% Cz2؉UF.qVqC'c^A9AQNp['"9p4&>FN>EsRgsǾQfֈhĢs(+0B (QHLd$&`u4(%Ls G,G~a[CEҊuZŽ h;qTױ]D%,g"jH.fyZΣe%45wG%UsG=*5wzAJ-rGk`03}:/vj fhI䳫H3o+A#5ҀAA!+ݯ_^fI^q2@ 굘Uajn\UBY*ђWiTi CmMڟ߾n=UPKfl.+ذs]w[mybw\.sºi,|.1q]R6 EJ>+KU,V!]Ynu O2jutr4;TCT ??֯33.17*VO6i̥ˏl[u\\]b/}׊XwLyBm/NJ," Lz>\KЁ[]ϑ߼X6|n8ň5Yx1t6eD9{\:σ@#}ݴ%l<PS7"KFY˟כM-`㾋O I5P<Ssu_S2m">qs{P3E9l"D4 ]f# ɎC >z.ԫ%eL̾wx%]@%rJSVs|JT3֊+ Xcu%+*QعD%WE]@uER}@^ }YFfu^0S 7<|/x31݃|%P;JϘ([͂CT7п͋.g~aT$` 7qOıOfR=.ޏ,XUͱ{Hr[쯘X.`Ztƣr)L&QKŹ{4T<0¹RW@` $ryg0ZI]]%*-+N&]BW`ѝDD>hKTWfRW@BqtE]%j ]WWJzJRA)՝QW\<4E+ T֦c@ӻ_,#ׇLM΋J17HaHȐDSc0 U`Bz8(ᰉN^m~Zb1Cht\|fiZc^[/~ײ[>lo/j5\gUj+˜RiT6JU!^٠LeE+ZжʼnqX/}Ժ~?FcVykl**7ROyW=euz ><Vŗ J:%**=Vڧy>h$*8ʝ7\ Ø8P=d:FY4‚SFNZ:분oii0ظ+qjZކ`ݾfނMnt5,r}vnlo~~n[z;%)>5v,_bzHPqcBu5/ɼ+|fK^i1c7_UתLbz'z"CƽIf!x " #@,EW:U)Av޵-#_Ld֏3;2/-KjQv$u5hf3:ڲbUy D }q5{Gch#3{ط^6З  [Uc^9IjUdF۔S_JnX ZVZlC D!sLгm\鐥&MkDg UDBJM=H%5c/?/{jqS +2Oǿ|]An6WM%: vKUZr ES0K0} \ZUC#Dm[9EW.ƄP-)rչbBI{o-5a<5ű/Lmf7_'+vg"Xb^Y5py]qP/,T T\޷R]حW_ ݭ?cuaRds{a:v0CLy|1X۝"oaz*ӛ޼g?X]:M.JwJvyP/֑;bJ'hz1ɩ0@?%.M&8<̮ͤ@@sUsWӳUK'[ %D " SU6kR8D Q[MB9PCԡtꕖN%[Ά$iE"U)u2ѓiA墒+[s`CX z?@El'poyšj/֫oǁb ŌZog(U_uof+nm|;G?a}Ev">|:zsI)E̟n6&_2W=qJq Dĝ4Ϫic=Eg''TN{afi fGUmYh!._O\.!27D!0ryOY"t&޽p`♃f`4^#NOzTJMnqaoG9(ێ닾>gzy@ׄrH)nׅ[J_t[...GYlZt65 q{{Et2= aqb=^toz[wi/|b!Ht- j{_ZVcrXޢǓu˃[}u\CLb䔱!iM}w['Zx*Qld'6a AO߹pg6m,qgi}<_n3-ҦWUz==З}[z}XL+qԞڽ2Ȇ^oe,k{7,kMWc*냊]ݙ+CgTgY[|&tPf\]iqyvq;zGw-ߪvVlʁS<cصݳ?^  Co7Jre.'iRQ7'khn*-&>IYBNuVw GSf*)k4!J 0,t Bܼʾ}\ckkj_Nncv#Z& ztFUD45ѩ)EZ #c}g RkɆoVi "э1-.m(Ӛd0:\b+9%x&h gRIUf^YCSF)E{lM!xx2 y&xB.['lQ~Bm=BP:b&huQE?I*+C*KxV(s.̱Jx_ا\|8nΚ{TU͍TRmZ S$-m(՜ȵs2JwF6Y!-ю4V\jH)ťQB/ g2Nkxj5̸XēvBP`%>]ІlTS*tKĢ"7%bf)dXڲZfB !!oG LemTȗji`ER4}FnQ xgXtx4skUsTG"? 
('U,X|պ7%2xBgҥ1$6b)9cK: :B2Z&C_gh}\WWTCP\L.HJc}!UP!X)J^*`P|Ojɔ2g, P4[5Ҭ w5T@Tfv2f`ys]@,U%C.:TzeC 1w`ڧ;hCuD!%:.h 94t'`im5/#e`[{f AnK)A2vQ:t k*-^2I[v(g rQj2[h"tg e#%;cE1p 2#*fWbeE@@!o;k[0a( !릒|p]5,j )bl)'Jb|nRlOtb#C[Y;IC52 h2+o, EiۧDU^Pڬ J( 棭~O@g Đ*6mz9k.~Y{Mii\וI[&nnx"JN=)*5z8mÿ4Nd4^[ۊfM6$󤑲ZhhmIf c0嗓 z(͈-(:1h7%<%2`z'ȍ44(JN `XgRF)1#K[`L9PX 76u1l;b jV$4K)FinV1;Xx[0p 1]h#se H,ܼC`Tu@ 4) AmT6EU 19j;͊ZXCXAZI`%较^ub2K!e3\r!;oYGy~ThL6K}͊Xh؝ k D @(|`"ʙ9 XfH %=`e2}^H-O38%0ƇJD=T{eqކ`PMr!`Un. / cV0K`RpIP.H,pR&!Kk|ugWy"SR^vvS.o-pH5I8b,Ԭ j!)9[W}hݵ -j'OW 1*_fu!e'ݠN-ҡN\xn\vo X+=3~d'p:ݓaz9}Ј'p!)  v&AVږܒ`$[uLY֡:9 gs Z}9Әs-\{Mraǧ :_%jKd]fi`MvkR,߆lTMu,֫5\8i%a~jMIv2V&1bwNa&ųA'kW~lP<|ʕ៻{?2 jG՘E~TΓju߾u%j]]zLL +(Κ왲TjS6R+-S>$za7R5l^b Zb IG"VIH:@t$ IH:@t$ IH:@t$ IH:@t$ IH:@t$ IH:@totm_._NG`ƿ7Hv^ =<( YvmN@19x)/Aȏa5A_+H5||V?cتc>o ߙ}w=&F8:Hkv؝lISP=7L{?osm)F·Bhl( 'O!)R 2q%(R:ҁt )H@JR:ҁt )H@JR:ҁt )H@JR:ҁt )H@JR:ҁt )H@JR:ҁt ).~hJOhlk$rDGGiEH!w `[x Aj1"kG{^?nn0 Km]l=~aa!/~A4st5kzy}2P*'M9ُ7fW`H]9$O@$}o"YzW݋$S޿8` ? }d0zkRgz0U#x+ ['tS$PCHfl_ !lyy h„_{y: \`ЉLg;Y0#ҎĩЎ~^M: 5ĵaiJ1ξ^WtR ^f̹>ԃ#.HΤ5!EdfCLqr]lf ec](QT.@HtZY8IRoz;;q̫TQ uB,GsY;@i9E*)[XQ$ C|TjShkLd {Ժ%HtE85vӜsy*ZWkZ[F v㌏mdQ$^.jxg ")M=:7ՇEicIᗙ`"5)Өy6j0AG>tNu67ևٮ;~^U1Fl5?ՈFԨQ#n2CS" ZbςJ`j-\+%jZ-RCjfl׈Kuqһfd_hEzIY]`Q4spgF兠ci9`{>oW}7}N!݀ݎobo}#AF7o}#AF7o}#AF7o}#AF7o}#AF7o}#AF 3^Z,Zooүsi@ p14_ts/eqaY'.L`G"q5M\z'qeR*}ZKK}􌁗H]5!F]Uq5=uUuH n4oP]q.g@`٨*<uUUU0TWߢLzyG%g{0O?}w19"tuZdR5@d¬x>[~Ojߤ@р^|chWM`"!>?\\,~ӂ~l~>0CZZ<^]#\NѸfx(#Nl>Fg˖؃YZ raǧEIzX- D3 -;z~}]}~hů2~Ȗ֫(50?uʾ'KlS&1bTΩfl8N֪?]2#ON?U[<;Q9O^3!?k|Hz p;$c VPW&{lR+$%͔Zi!Q\Mz{kw/wӽ:[So%ʠչYbܛ-7o*P@̈́9Øo8Sb P$Yx V=%/Aȏ&bcWǧ||V?c*/XwDI> xp g$jaMJ>2cWد4M̆a3qۇa0~rB6"?BX//_>m8ft=݅w]ͣ'!76*u}1X*Zw_[IpG[:}lوͣu- U۽M;(\Foovu6]GyÜߩTv]WbԜΟ\586oJ]u&^Zz'L칥3+yn7}M%i35${koљjsRtevUkO>E^--jq0/ +|5khzqqcr𼃽-Ú?gQn1(Hs5cB׃.>e˜dD&x/}櫚'HKW\0WE+޴"֊ל} I)/|ԳsS}X6>~ɊـIF3Q: > 8Tg}c}LsgU1Fl5?ՈFԨQ#n2CS" Zbqd< *a΃ `REtNZfDc[ 
E)̆8;,uJ:͒}m-Eԋ'Wg}tyE0,V,oS+p!ψ`[vCqBʓ'؂R$~[!ռy4F㏽H-N*(Dd>p?>P;|+gؽD9Y;{7~`C_Nhnon@ˢ:ys܄=[V'uuFqufWz{0\(d4ŸHȑД8o-*$h w2h"`N+Pd<1Fy`VTcnss&HZc'&&JC̃Ǐ,ؒBz'ml0Zg)]p"إ˿>ߖ=Ny#YGFBc778'u4yt$߻{pp_x\᤿wy&$%~~>?ai zNN#k~ߧSu*;^ިP<='wvӘmw=~ XvO{e 6)7yMYGnbU'+JUxndYSHfTppUNx0٢W=Xg_-Ût4+PIjɤe\)Ú.)YDQֈjUUg;(X=\'-|p>֗ɟFz,\?BYa:w<=lmXS;ϧh]-G20<i".{KE2 m<$!tDuNșxνSBsMA2nG˟,*dC=wLs& 88n>SS!Ȅ$멚JʙϙZtGT 6xcL;g6;\UDnKU}D:8WJ;ٮ/DoJZSLrYdzs=M9@c n`H_Ʈ(H03DeIܐ G{^}?Ysf֠ئ Fݍg6}u:a=ujwʦ|O7Wmr낙enpݜaFcJBK:'-w%cWsX%)Vd>&q,/)?;\քq֥-9_/Q!= 7Xs?2#p]κo\ޕ6#bv1>g'j׶D!َ>Tj%oV-Uy Pᄊf%Z"xئ_ɏi4~4 4ݰXT^^Q;qu R!$h%FGdP JQ)տn˼g6Vܴ݀gɌ6]&QBҗb *+FZB \"j S箮 zuyS41 ׿}ǜNz =5c|t7uLz(Vo KO}0D\E  6GP`3 e,< AlyK|*KKxWUCuLO3>y‚>pJ8K$,lTaAhп>e?Om/ވ,4uՍN3L%%ZH(ƀ’;>dt eJQ\$inE=g z !w_ )кa[ldf&ͳ>< γj v,zӴ/`lfғ.榜v9ug_+k[{s[-;lg.~c8_YEcgy&p\^ʻ6_+H 2 mќ C%bcuG;ꢋhr !ꘜ9mr[UD~2e+TUPE:q|{fm9U,:7+jK= |*OU6 4z8%Cp?&Q2EW%<Tbdf*ۡ {܆lCWU+gזP{͉)<Վ'Ytf6q.{GgVASR߹9( 8K5Y8.pcȒ'Ͱ*>4)* dɝL)r> BjњLp6r=vBWƇٳ/L7FOm0]`Vx2k=&ѕciݹڀ옳`O}{7m캍]ҝHVJڬUs.h 'o ?7+)jkq5Iٹn[G3? ѱi>026f%&1qth;\˪..֜?|ɗ[yon+Ӟ 3<|;P\EϭOt^y1 svPk̹(T0k< n|bfvKFҊ˥2H~ntw>GwweFk&LzŠm`ܕ-gd,/6a59]&O HoXnZ?&_Gʤ+4 W43. 4&H׹H%` f的"ېTFI+^ɉOrcT#^Eo PyϨ9E4]jxdBxu?WVKo_Y};vo]TqZywe| $I/@Z,$cH mdH 8UՌM1{rՔ##1K.HRl.EIQs- \&%U[3V#gf܎RNʷgmֆh@g|T,L0Ym*kjqry9:qS"V֋^/zEŃ0za)۲Lek9TOQ }{DgU꼶}NЙ\f?lvFeءc1YREg}%CiKQ\OvK|2YݻvGc91<܄$ z9OMvl Ofh$h(~A2k|;~AC0(u=qD@nClj|Bx P%#yQ'.R s S9b$kQE$%-)ĄH_+p'&g ^H]ڪGTuH xnFأ#0QY>B *O&u+$<&qJmBMM%>=[yKZ()HSFkdV>J~yT$-d 6EFT/>ژ=:(D0m*fW.$J-*!eJ"g6ls PFA"+ ,9Mc%Z8gd VV#gZ&cBs7b1Bƃ}Ayq -BlܗlBc'%gaNXjNN|710p"DU$).%Urut U٧R?ܚ#*I]$T9ǡ[P(b.V|~EYȳYdKvHD}j+x 5Y"i+lW\{xޣSl81bI,'ɣ̸4s`'TIJPni8JRdY)_m3d `{rdCQ6Lڭ;{n4KhD>_\[K&M)_~liͷq??wG1O1 1w>¡ B4o9۠$ri o=|`S|N]jR~%|e|;~ըfϴ R$ A)[O 2KO;4y,i~5:'ʧs~ǻ|Ov#hr9z)5NCr&C0N;IW!; Mv̹؇.D(두7B珮9=l5jaD槥 dž9}-K_Dc0{/X4`4>3}?%}?~e0[x?vD=g|]{zԛx0^]^ӌp&2M o (NJ vͰ VɃ]It,vVK!'d V}g})]^zr+MʦI.O~4'"~\itEfT<#yJ-wu3-mҹ!Kw8o*(OU0:twyѱL? cCI%`}R\[^09ƗU[2KSI!Df$}͟v=Γ?յ%#? 
6s$c./$դFjR[MjIm5&դFRYi$kI@դVjR[MjIK}ێ**#iL-M~djLMi25M4l4MF6&j25%A_jKMfɬ4ԀC_jK R_jKM~/5iKM~/5&䗚R_jKM~/5&䗚R_jKM~/E&t&KGsRS_jKM}/5ԗZ<ףPi-iQ^{0ߕ2a3LΓG)΄ߒw7*o?t$~I*d*`,ۨ$RT1923%í,aJHF;#PEm11߂ו;4Ӛ_-m+kg41heRI>c Go;goOa3/M?o5lNbWos,FR#~W]{5GUNk5;ZiNJ]F겳g죚^*@4>E+B ͬ#::b 't9Z#~|5AꗫI4!]S ?<6HgZ_jNP:St s)T˓dhlE:QldU.pc%4YF ZU7tJJfa۔G>刉b:sHG4Oi/T千jlkO!Ct*̓-ܸ|3;U5Y +xPYxJ[1Wg_=[WִYgqm>$L~$3Oȅ㭏FZCy6 X˯]W߼5o^zr}Fk޷9$j뼹ӯ 78nt}w+pqK.Ŧ(/_ߺ1^][ן>5QCԗ=77tY7wq2a?Av&x_jȠ h]rFUgJ3I}\g'6U=0xH{[]x@(3#tvQ>Y]r>x)`4 19z]!a@:jHHCW]Kȴ* lC.y-&''c<-@1ȕkf⪙yq*G,6.SBe~rW뭲}zYTE E& 2"+5, ᘡg v`= FP72:ǘ=9K$`b.h`:fEt)W82Tmdffd\Ru^g< 2b½g]/+nn^=< ʟ=йpqx5>qVZNƜCEb"X%$w,hY Eb19ِ@F(x/@"„Q u&weV2 wsz.\˜͈Gqɹ<Ԯ;ڪ2jˆڍn'4z J6VYdG)`_]ёAx.x#9L̑IĂI8F#?ꤝٌS6-x2X<ueDT "p͸̊Ĺe&9\.0>O@GT Ť'\#8InF 胡/OkH&j(m+VFjlMASZK2. ת4Rhԍs6(} [Y\%M$0 @pqx*xXw<me<| i?sNŵ9Ih P?T,hByѷMp8*t̒+) ;mCAHO%Ul1Ǔ/2섍ܔ-~g "=lH4wF:#=b(mwߪ \U\W8T/eACxLO*]ٷ3)92jiM CtR+e 霹 Ѡe\Ihȱ%)-lG/fSɋ6h[ q9\xC#uCut4~q<kPM"QP;LgHzj$i~q307b' WCB3[)_w3M>iJpVqr仹@|0YG#JJiis3wV%˹: 8}*('CJ!W`l4vኬ4v#\dOKWfLuO4:r*?RE(foT8NS@{XJxB y "aޭ\Z\+绅lI0x]c>RZס`#=`5_\ 2Ot<+N_~zՑZ+ZF+o+ZhmwjYTZv-r6jɃ4^Xȧ.#LV;8B"4@QoԴnO.UoߗJX+˃ib]Pq)'^}H{  B@9h9-]0Nw9lIrG8}fpsFc1XqVHZ0+ FTBZqFpE+i:\agWRIvFpU >b>vUኬ \m8b`-"s9WZ8b !\iͨ î׮Js+k;w2z9L^АzUXju4t'7ut'At @`*Ĩ\sTdN:iò^(w=Jm G}I$?7AM7(!Do^r^ynH38LB,fedƀ ڊo-t{ bB^ټo6>*3. N/@FfEFY"X]4V(کϠ(VfŠUJ3\^]<2o8WBWYF[$D4.9;Qܙ,|̤hpo S"Y6Q5Y=:EtYhX”tqoaO#G|3#V(9C[; dA8.M+kyHȘʈilXT/Dqx-T Pğ:bСc ܱL? ̣mj=.N)-`,&J)rTDHL\јʦ:>u%@%%>f$\5D sk %zMVA%k9O]zi,ڂw7rkOLߜ"8|H!9N}dO2(>Mmdt;K|D%p&4r-\'kċYtA&+N,'bfr'D} 󁹨M(EcUX !tO Kk /lMIW6Rͷt|qܒ Y+uznQ#QL-4s[:BG]FQ|!4̅3:yLClg<:riuJ6Ȧa;s1~ʃѐ|>:Of`6 4qd!H)XPz"mMl#I<o%˲[RdʖfbTxhYD٢*H<܃2hf Jq&PΤ ǝQ1xΒj}8\1q`w~ _[ϟ?j}Kaj-- ™5 Ƽk"8qz`[,+@;}rpD3 $LXhK)ϻ뭰4ޡA1F)zWe 鴚_'Q}H}KdQ_\$KHE㵅q۪gkQs=~. 
93O9U<r:MyNbզ) iߧ>_#u_IxrBH^zF_~~^d˓m>cheroHG9ja؊IG]i"uqf+I S[4"`!`>.}lB]h$$P?eV9@4VS55"H&aIIsƈ>_$6q2>05cBazi}yq72SGP 96IFw rŽ.ѩoB?ipGxn$nO8nFb }*BBp2'uttz(fY|JE0 `?ֱzpikSj[xR z8{) kT^$~sN!qrrjnp҃x3p/͝7ofg0;/(OϚgEhfu"3;ZtyBZ2%t lM츷eUy0ѸQ؃L>_;zzb8[rVY%Z|WF9#$7,P}1\W1=qp_E5ǫOߌ·é+QuLO.t*Y*ּDMWQs{g8HoO{o99滷'}78 ,!"  }wmZग़Ms [4Yɷn;|s+wv%Pb7.mQk}XVW1[S8lJ1

©0Fvá9& 9Ā$ke 띺ǎ-aU?)B2=pzt$9Z9iNDdbΞTg.AH9 8xUdhN8vXIZE};u)+W+/}[1#(ouquVh$*T]m"';Fңb y*Yo) 5Sph-QA5&S)%D O%$)/Z )zaijb (w[ @HF%S 1Y#gKߧ͇vO3\!y/aοZG[T:l71:s.~;nz|9l)ҍt+B!U6ͤ>eyN*KWM邔۱w2}\SMJׇxԭoQ.(xÕ<`EAfKo&PPϋnۖS6!0O3lCЗ=7"\0}sўD:6{#pY#QUZds33*]e7粳nsZ;֯g'OGHn~~G?\y,Fr}2o_}:s|fE]u\imF#xJŽ':}bV袡DkA"i  2OQF1.@)E/+&!O'5598Y Sn1=yţlc-.0#=%iɹG$2fƋ%W\-(h-jV)Y' [kB2N AR-lB#U2ge\R攏R-¶z[7Uv0F57WO~v>@eχn5Je!E˽>ﴳ brFE@w,2Fr$O˞>5Oݐ?1βRy b&&1@ɌL>&WP~.&fƣ\.ZmYjjvg-YsCxъ"> D c aMXe,j* !cЋ<*5!/YdK1%)?ꨬ-lN1ƃcW [D["ݐ"bXH$"( Tx"':@4 Qz@Z_!M чq3!<(`)xTGO9qޜRL-"~jvq:Ot:Xg1-.vQv4>'"1:j 8 AN,,$(sVsf zx(vjMa{hv{0a23{\קC>Q:**YOU3>k! $@ZIB"UN9Q品*Mił[$@2'@>B)p'YY6.Nw@f=z `ALW J$Org4Y_OBCMw~EIEȫ?7v*aW ҏb<~f4z5Q>LZΝox,m93dp,2p81X㬞oFf/^6_[ x!o"@mm}#-z8vmBExN^t_XA8նv7rSs~UV4l\ ?3hߗOWYˮ2K>6WLaSn^wOYNS^v5ß" t\Զrh-GKkΌgXY)Y0G !P+c4c]9.:DeJ]FZ5  4q0;c$:'-(S:Jo)%ABDPsE̙yt8 }M@kxXڅ-H99\Ign!:ib:>spnȗڵJ ƃ0ix/O $"me/'w_~gZȑ"6,sv,Y$mm#ya+~%Ŧm9tj6b=vVM3tݬE+s?a6C Ǎ9'HwIDD`H\^J1pU99@$ǹq0RyHNrH%>ϭB/Ubv4n5rvsF /YsW]g_}m uE$C^kOY -#z`#>`#>@ KǷVqFd"a8i}C2@vKyЧw!ݺ=*P\Y]g 7i6ϱ"3n=fkxCV!,"/1"ϑY֚ Պc.T&L@*AaU]|]=lnZjн>. f~6b~u7`_\_r#M't^ 9IZbIZM1C"5sIh+k.487{`JQXhO7pR*3&> @"@Qa., t EpfѶHwIZ? ^2.7&s# Y1%Rte&qz#izϴTY= k6`pa;UDN|0d? Ԏbk$:GRzHÐ9ե0Q4eN>&zv9p"Q>%F|WkJ2K3^yP,&#MN[?ĵiোߍ: _VY][x1_J@*Fr?zd@s4~X$bl-T9U Կ-^q8^OCۛ~x{'o|G{h6&{=5iߡ)Ykho5 jrwrǸ1aZ!"A tم(q`67t:u,u1v οEv<$$ax1c&βN+*zE-+Ҵ!YVU(b6sR8%TL򁾢d,[ZeCUe0r@`,/BB5q ;fuw ΁/O;ݗntq_ڀza"OmeyhmM˪>?³~9/%>7rr޿z.Jڬ89mfGD`$Ժ͢i:dzvH-jۦVϗwwGWZn_%ົֻwgOy?e"ţ;:^ _fhx$_Wzr4WjAts4m3H}ѓk5^7":8@|[) *RAbI,4EhLAUS?x JR"`-Gm0yPF TTJxV;D78,J5k2 Z0]&1)TKDs[.ERO! 
wo1y $VH-7A`-].95NKdVC13}T%I)%E6\q&bIπ.]dJ[⒕ l@_U29Hў6YrA얰$ٱ1P[EYyFzεTmXݒV& BUY^>,v3(x3I{g_f"y ?M>%yc:%j%[b: %EW$圑H duKIY4eGdB"6h0h6Re2g2Be];Ly(R긫6v`x̄SU"T$o'| Q H>9TUfml" F抴hɒIJ2Z$:Y_;Gz5rvV/auaI7D>*mehzKĭV-N3'be.Yf$5OF 0J9!.Bt##8i%K2eJ"gB 4x2I eea)%Z;io-rvKOg>u 8[y.*\rzm+* ZXRs/Q18ZY\ K$$ B^.>\<yXmuUM>,)7mNpy?Z{?.}Od, ů 'X@X]k~yEՋEՋ?{{i*ta2r/eN/iq~{QkSJ}?*ŕh4'h)3LeeAHH\#gZϾ//t m qP`țD0$ߊDCM;v]lt?ѽ"Sa).ťW-,:K+hW>`!t<[ߗ4?6q 2h@qer1]z>|oK_/uO3PY{a߼hm֧NzʍRL7!oQZNV&=$⿓]CJ?KᐔUY"rDY,+9B+RI''U#=Rx^pLĸYD &hDQe")YܖrBmZQ@:*yȶڭ*u5/jGbO8]Or5]SڲX6^{es@<:mb68Z:^KK]РҮ9isRiIB暠kZ(MУ-U. չRbtUWj7|2j 5!w<ࢊx:gUchI@;!צfm~<40Սo]+?k^ӍU;uC=3:bLc}^<*F2:듀b_,o9mW;| )4q2ʥ=e^&kdK"H(u㒇H]my}89pcRBNK# CA{IpCH)j FBAJ( ZJYr1y›I%L 9# 2>3WHX=L|:l(F;tqIn_vneo4Cz1?[}1 |fh:$˼p.NJst71a@@.AdA %:WqLr< #4z.T(SZ!4鉩2ڱq7=$]eE-9m)azEY>fIxw ?yIF6vFFyhXpUP&sb ˯B>_|4X*xcHob BpC([0QĴő 5GkI'#-M:"Zr&l(Kqo"K k,YEnuS }:S+m}fF+|[5f4g.vhD7<QZ3] ⨔bU:JR%oDQACS`AsI*(P75{8BR4F$yt*3"#WJd>JVA$k[(3m5%g,oG1b PbN>Ԏ9\~( ` hJQ޻$֖*ȵMyp`Skw?t2"au >9ˁBI+L!j-tp0fOG!aBؘ^)nJyaHl0mO9?l]VeeoMU>:}!y<ۣtTd]Rw9xfy aNBH'FDOH,9K)Ip{H'qO%3b/z:EO/y5c޺%N/SA L0iJ1K*uOo"8~b^%TLM_>L\'m:xpYS"R_o(" H 8u `wv.%>9#59QB{ozlV_o/贺яkf?j8$ԃڑ]o!+K5@t)B7<_x( Y`s;)hm0112ZXd@<l0^ ƗHdgXnɁ#d^s`yL::] zp>ъ rhDfwi`acw3XZ4g@3ފh:jzo ʘOFcv; 'j6ft֝;R5rWrf[E|{PKmn\k%$shdVQ:rRQ1?8&d;UkY)WaO]ч>%xBFPN8tS21H)M@  špUӍ]denG4V^aHd̀wUph;pv%y"r9`l56GT_oƾbczsi LIq. 
A)$95zoGXH!&&4|/liMk4O_;̐C0N;W!SВƩڼ5NƩ8RTjJSq*5NƩ@ +7k06F߁.殀1ӂAR08AWi!)B֫]c Y/~n{g Y|6O<-DԢ5SϢg7mٌ0&v"Q;+pˁ.xڹ_,Nt߬c,?CY֮$1K#~=<֊c_>,"9Et c/M6]`dt6ɵ:*BՄ">HB]J\%zp x*irN5\h9Z\wb5ql3+uB|:z>ze˳:o/QӗKXno\w>6I|E}N*/9IR1a%iS2s\jMYB-`%S!l\Z|A]_ Ψ_f*o1#J@0L& J], YDrdIKU,ޜ?jc"=M4{gOX!G^z!\,RTcPZ3QvtG0j,?OjƐ\g((C 腡RH6$\yEn^D+ < !5]ixWΘxK?@uZJ308l~JKQ'N/JmYPIX9oyO&$"D\ŕi6crL%',wkTaxB󻒅ww.+S& c~GQ4vNqt}rP2}Jdk:޽{;:xNXp(ija曰 Zp6}7`fL\ѻ BWj7`>ܺfu\q|(z7O} rwedy "RWq7?~|Zڕ+UˈPrEUHf|QS,oAz7kz57ǥ\`WUWޫ1pIòfRV>= e!W>) ?9l7)21toJo0ByqNob^y%ē8zFg.$L[T֬UM-'w>|?|~?ޅeB$[!U"QX,?o42fKK-FQ|-XKj gq !Hnne7nAz)X#% { JW6 V3I )j)YU`\`@(=ua.|~?ìた%e$Sȟ23O0eW<)8dmT9tbd;|;,sa+ n2WgaRL#"c%Q!c%q0BƪfXIi22VŌ]ɗ ~ Eyb(;Йu^X"D9,~UԂd߹@袴:K5Y9nƧ]Z;ǐ% YEifu{61z6Kr+3њIr'!U^萶8[j&|HŮ Fda,/7 t>o]b h'-va7\s#;솩o>眶ٽnl$Z& t݅+qF_qĠ!t}GY2ئy+ohU>yaz}S*Qpvϱ/Ib؉{,3]rh(n,| S6ZJCv{h!`$ 2$x |dK2:l:iB2Qda<B &[^;d_ZTwfIkW+h@bzig8x]R!AiȆe #]"j sMmH*/D$'#1B*\{1 CeC&fCM䖦!Tӓco^=mX/mʛGR!H^,:Y;%!CF&4h|pUQ1fOяr%dn)`@eb^D" Τs.HFjlFzJ=*Z+Be,{cGW3^owrqzF?й~~o8zርN˝993dً`'w.^@Z-(K.QfP= ) lJm& 6LCdV2 DÏFjlFl> !桠v5ueV m ABQ',2R^0_Q ]SU-)v)=H,ѓ;$ P մcW<{G g~[Yq*&ݽ.ߎ?ˀJWwt̒+E޶+ Z:/Z~wkVyQHy `0V& 3)E{n DF lecيwyD6RZ'ٻFr$WzYtIFjӵzeO[S*-\ pɇdɢ-ML1I/8J(NY)XɆkZ ZY6Ir`ݻ?ȳ^AĒW-o*rsRzmuyئX8l+ey,j%ƤE'WbV#)sg1"Ou^D#\a:wQ| DC/MWH[#j/93YfuN딓s/ : F\'h13n"RD66hbBw8]E[b+]Yr_<ˤ^M},1H1=+uuKC.)|dNl/{^"}z5왂І",)E [O1'>]vfR`)-.9)k|^p潫@G(\}#VE6?xkoËsPΐvIN ϸ A!\dgiT% #xHQDY4nЀ`s,qZp$3HMsq`KF3_d!(ʆbƐM" ,YMXҽ@#T6<26= y^hMViCɱ9%BeeFzFI+"9 ByV84cjZccWWަtvWQ82P{ږ?. XBVK5@(MЅn:%ׂ;gP Zr;2y^gLI&)S/?~ͥ[ٟgy2t̕5 K\Bw1>x)Mٕ}R; ]:Oc=X'^JE_$}).^?.?aPxq&<͛&ϯ9At7e֜i==:٫{nxuDҺUx8::\L/-j~SY=@16Y/(b2?2QS3ilR*й5Om^<3n? 6?ha2|[^P%ȗˆc-؇,٢`8b԰S_M<3," /˶޽OMgS\atG;XtK;l}5".n :6>{ 5ihy3FF]L]OмG{nʻl41yo[k7ܩf!jm7|³KvT#WJ `8J%M4Rrc$Ctpdl3&d;0mGTَ~}3ΐ*yh0 =CP"hB:=9SN:̆hƔ,MzU@/i}:Vv_λXu>]SeP"/D{V;7Åa1ꏛ?T|._0l–6#Y\Ǡ`6kktr\x%i#"K4xydMy(| Koٛy ]뷼ܩbvcm;H1F)Q߾!Eyc.'\ʚa;'@0hm+K+TolX!D&!:# 2&T`4y[۾e EAhMYQl=]ky%!)Ի!ˇ</oz?_/n,7 bT  i%e![Jb! 
V!d 204\U`UEDN{oB "=Ҋ1R:z!mu|9hBYi"'nP{A /2 [D&%M'`Be=&Ύz6׫@V [&jI XglrT i݁-/L2-GDV3O"5U%PGg/cVۨK3I ٠ŝtdMI5rnedQjT+̆Njvҍvl㙇5-Q,W*I@9ˍr9&2K!(܀pTL[96,Q3( xe+yY!sP9f {DY:MNi#Tϒi[$u帕zz*WmQir~c%dc'mHe/{2Rw೽` ,pЯcTHʲvgHD"i[ּuuOOTW! /94LY`ӆ(ɷGN^Ԯjca~,c|"q hPJInjaeR&qjS5U!ip t B)jc$0p8JsgՎ1e eh ,dڞ7%Κ5UzOfb?N,W>a$P}2|T10 )8\Z0)<$j '^S/Y1琒`3J߽"Z`39ʛddD:"XH@qR*s^. m61;0d"] {!ccPg뭣Xg&Z5` hAi* (߶--_f?ozZ Je".B.i= 0-M"aT84/hTxd;H>'JdL P;/Of{3dx BsEJwS?>|1eQtøOhM]'rGEJS^^0g|QoRW 3,Y9tO g`X(1OSN(v8{s<! ^Ǯ)&?$w|Q4Ht6#PhSx@QSq HǟVzgYboT_tt~[ [u;oK[Axpd_O~?¢WNF'J0h᠉gNF_"lp:ƹ"L;6^ZqKHRJoTh|uQ_݌k>:jt>>vF ߎnYfnGvQjT%z{n{ӡkNȲYtX KU8 C+uo&=ms=9Zg.U :VU)~Iü4NfHUϏ"z"{O1- LZMX>?]}} :CL>U>VIV(+q/arKQt?>lR@W%ju;t;8IۏO߿}}ۇ7?RЧoop- Q9A[Vw۟mZΚjۛ44_i Mz<ۥvyI]hW!]* 6 Uǒ 6ƃ6|Bs=az& \6tq'lGQKh$%֕\&R:0 D!)?w5_"hⲦj&.kIφ4 `3XFcFk %aUGi=ZIt^D  @ CVv㔹srD-J%69 5'9bciؼOIaӄMYbՉf u-I8w~}\stbTGWݿU\W'W{^Aц}9VYQ SxTW _LѼSz>ۋW?|'\> 79X~/M_ߺƋ:*k6En/GT5'ݟ\Ctyu7畛ׯS#k:![賖[k?mΉ"c܆dj K0!1#QeZdRFꈎ-iMj}3فd/LvIgν $\AqMRIb,iQHA' ZG#[i} d%Wq8qQJZq@tLimr'G9S8DK^G>ǫCC?ŋf+ sg_rT n'nZm'ٱ6LJl%Ɗ#d0,R8\p.yVsLWU8Tr_R9& L^lQ ՠ76P*|Us(DxbBmjcqӿLEtrJ1]ls1`WY`P\MءUpI L \eqWYZW( +.5 iZ \lSR2h• w3z8pep(pd*K)y W/|WPspkWY\;\e)[zprHpfî`,WYJ~v1j q熫lj{QҊ][qR2_pW_yˮp(T{[K;fēG9ma\s&7S V+P:iɭ5v,9w[pcs[a溌Ȋ/x&. 
1ѓ!:Zۨh6e~xY<#"1ItE0UN@ƽҙsQsϓwDG,9% 9.wqݛQQ|]EboãG Ƃ3u]:ݺ|oƌzCīk#"QՄ9'1cz2 ձ,>,ʯ߮3Z={gec\r:W~/= :͍#VU!Y%bsy Mr:o|?{}ĜDHsNFƆT1ƦsDp:٪zolO%8XG}Dg6~"5u^;jW\8;4ڡC1vǎ6[ud˒aIRT6PcRL/*;W-^cuxUR<jdkfK뿞x`Fɕ0:z&|BIDù542gtJTĂP)IRXJіei2Q6^;3F?PEWP\^1ykBUz㢘>w1òW_{{ޒ|qY,gw5xwgd'黶8fIn+7-_]8;kϗsIV7M1Uxaexk-{&'qҹ֟L-"S&jt^_3^z8zIC%P捤hhHמ2 I ك-rjc$4:"9P* VG#$KCM9,NIxg g<%>s;^v>U)ֲ|&R19mm_yڗ7Zb/TՀ堁t ^gWfuw3\LƃKiKRiA}jrmX.K (O{#KКW:$ uMK!PBUS' U NRRlQ".XHH@mI_#e8&=B?%(R!)Uσ"%II#G$]SU]U]"4p,tjBPƹIB"gP !_O}b)fJ*؆8mX7+0C.i‹9bbw}-sWtLS7L RWtkO\\brmdh<\"ϵ 50W8ާcE`_ rZ@UN" m [{ͬw]uʊHs]`)!$*9 Zƒ`gI 9 Xv".8+pȇhV"V΋|>z |/ h F<1p\'C}7*l6/ng6|\r_t/\IHfJ8 {!XKj ;MVL|DBԪKxP:$'_71roӡjmP8Mn|Ʈ/H>>y8i5*++H '2XRQ GBL`Q8B{3g` ZYKhTsP Xq4I Bee;e TA҇*&h^A3CMP3@p&eHw*G9K]:gX)AGDLZ8'='G% \Q{ '7}ztq sx聀L\0땡 mQ)L:oWX 롧Ψڰ |;'XEWBp.$uٍnc;ّ\w=6?WZ`gwgmޓ PWמID΂^J"D_SLaMa^²):}#;g[nK/bG(σOiYi_ѧ2F|pKPLIl>L6?wUgm ÎЦv70~@032r/|H~5 o LW7^ , c9d~CdtįrywY[-O }ũGR]o0B磓QsE)3jU+{t $+zj'a\lQC8)UJ!"UJmJ׭Y<~^3<{ '2{L),'ZW{5Qm5AcHS[G0̈<5/Gh{cmCV5v}\jZe-<6 fUp(zoU8q0vHf븴-a7{<^^|C{3Hur2{ ^VV-j]^jhm21z[i[Sfꎏ 7a4۲Au轌1 o=by9DVSjDZ\OXqgd4g VZAk4:n?xI_E@•V{)tܢ򮄠Z +oQW'Ya[j5o ko<~lh2Z|gtG@4 5K&PUVXH[hAAp tu2tRܯ5]Ɲƚ٘m6^ s~Qf.q : WXR;G?S1M3duG:k0wQ*IP&s FƆTC_U9uDZ>kv>{}°-h}!㎓j֜)ϛg8'^91I";J( $Ab k݅֫)^XW<57 vqXbԛ m<~Rs6d E%I(diaхzqX[A%m=aiW3zg~8բK\;eTH.56EQj zk,HA%WK **}QNy Fͥ:rB 26sJ)ﺄ|g2=9nM%ǭ8x2J5yt !yGkQgOĜHzs1pĉ)3=,H l"6Yv=jr8<1g,|B%7 D w,I+ƙ4""GA90|KYN:Ftkxt8{YIW2k jN MP(xAcd4VpMX)08lTz|_DHFϕQ&(kDb}ฤVXFU4:'N (5"׋ZZ~1^OZaXJQ$xŜNKk6y[P!SG͝G7㠂D:É{ b],{[瘦^HrDy()$iTI|\L%a)4ɠ| U=ugA.g׭B|}wL|M>A1$7qy]R qHMrɕ&uڭ㼙=w8Q >tV氫2ա; ){S.oe­yh~K/ {J?.Mlà+ qtQB& GL.z%'8Kw[zly[y[܁1~A>Fx(k\6Qhuݡoufb߫M^nߚ8leёyCvP',~WjM:hLмupQ63d^1߷8?*˙+Z{H\& Wx.A^ȗgZ 7wW/U.N}g?;B5S*,y^0\Y!,`z{TUo;G'Į20Nt ebZj4 >qcDI@MBeG*Đ4t7(B31RHp# ଶq. 
PpA!7s2=ܘ8{d]!݌ohz2ly-wxɾ-}r8( Gl '$rFHW )b4" NAǗsoH39oY&qQt( rtP|h@6x~ǦۚZlI>y(Q$:N(ԩt.#% 7dqƣ8YUcܝL-jc;?ʫY\ؒԻ;EoՏ fPA ߧN،iaH\,~g)!qv|-Jx`mEuA=ߊתcrqZaU8 Msu4Ii\슡mLN[;w@HLƴbLmӰiPlfU>!8< w&dt2j6z~=|ز8y묌lYd۬kU 1icC!eu"D3#J|7Gt07 Z,-_rX?~~}N=peWxe%'A]aē0#Ve1boPe1%&U.NH|:w???׏xOǔx" #AAnM_~wjMw55L51f]n27DXCB4J:*%o[uuYnqK|o\Q"(V'o0,&\Ga$i!# s)W>R5+`Wpl|LclV ܳtd FRHE&BEg(~ A痵Ws0~YܱG','o) 5Sph-QA5(RZK4+bq! cHP NScp@ :g( j`T9!95t,#vΞOś),o4hUub;ZLm/x]!7o%_Z}\s.4bcܯ1cŢ5 w{VSKY]ULϏ )xw7UsaMTj;vum{kλY}䕞a6ni]C=p~:IGw l-5ׯ/F XsjnnL9pus*ю L,qБ"ZPowЉBFꈎ-UoTe73r8*;#K"=gweIz.)@?xÎ #OlR&R,od),E Cb"3#t٢@zuKs= }T%AdiLM>!K~_p>!_y'jo'G0){[DOD HG4ǟ?hHC!Q 5PDpJ[iPz\6G4WNN5>Vr;هNm؞lw'ۣj4Jz%iD$2fK&yɹVnd؁P$J!$֏x8e(nk 8l :%*I% Amdl؞b!aQpms֌k W:'_'~:/~uCa?rĖ+V-f brFQW@2Fr $Oۭ85Oː=12lrD b&&1@ʌL>&n[#g;b0,ڭqǾ-[FmѡvnizzTr <$,k+V2Y !#A6$^V0 r0A-d|2= &D%L{c̉ĺ2ǣQmoklҨ7x2싈eD"v1JS HE$AI&XQ=H@ oWC4҇qs2EyPR(Q{4ųͣm9 :Dj8G.5.u˸;\pq+Z0EbtP AN,RgIP:r@tx \<2qa>+5;.oN 9Sn\Y'}n s%鴡SjKJL\qː` ӼLM:eH-ԛ!@X׼ 6/Sa` l*Ap2r3LtpJI9iW`E2Z \!؍L%\A2vFp l 5l:YgjURw+p%闇Ҽ0\F/ W;Q \F%ӂ+\zJ%5 f \er=ypTW̕>Br᥍ݨpp7WN| nx/_Pٛwnrpk@TF2gpi֚B; !@a\Ϳ^.,dɡ,Bu{}txh|lݾG]S"b\U_UWFwA{N&I; 2+! x0HuL-P4)}k~Tͩ |E#]H|a7A|כ|oO~XqEvj}zql'7YO{;wO/F)(ɍ#'?|\8aKYuMy}kc(۟;ת Vp6&Wk5S{ R)oQ炘WBϯ6zpeEHZDxZIETvxr'9,nvO8Ӝ2S@)ajAB.L$`!NQ"t+~J6:k֔z.hCL$҄z?ygL'-HS:ڮ9 Vy֤S1j陋yfFjd!lC mSj-6 ە\.kS1kfE_&yTJh_Mԙ\0 LPRv&@<}돔}obD3BDN H9c^(|{!ة>/07L%"sI v ʅ*]IιT[ja8z]E`_ rZ@UN"+m [;ˬ BN4tʊHs`)!$*9 Zƒ`gI Xgb"BK .! 
Ǔ"mps?b/!GY/d F-qr?ӧpV ʭO_PrZ?˹P4'0WKN`bXc}(ز8Ww}W*@Px_8FQjI,Vv΁2Kb>SiyNyݓ>BPm"ZD9#9fX󌷰8)""fNZĕ-eE7T ضR7>oؤVޘ&,;oưV=nⱼ@{$G$^3 W<% "7\RS;dJ&RUwĽP:%瀚Ǡ;7T6~)KZQֳ/PZ y*g\StʴcHM"GBL`tpDȼ^5)% |[AN+GWa?*eCYI a%h;?B{$VU؍ 5mNO9x#Oo>K?(?W+qBף8\r4^!MfTՈ4=z퍥?yBh_&:_O.gk?޾-o]ͪܒ3kg \`; *daY2GΡ;gX5fY>ë&shZ~i MA 늿/l}6\7< L cf̺93UC Yenr2 ){oZE}P6ؚYw|%-eI֞..yїs99J6{xW ë&ÛfWןW:9f8&;@'DK{zsɁ|/RCȭ'x/w'!021+<( K>9%W'|-?W[GsP);͢4θdGřl!Hͽtp8w*CjP+Nz;{w<<&ch+LT1$\;nQZeSE\heV⨣8N[5N,QN۲{A64!+ę1ss`Ѵ za،+tvqK[=6c[$zr(WEv>YYJ^r-W&DU\%-ۈL腔 ȦQ zcM]\t]*K툚4mezy<.]?.Hc)6N)al2F9^n|Ga7x]w1JvVI*l_Ͱ>>7=V OCcPޞ53_L8Cl dMw+LZbPhO"jndn{-rOKu3Cq׊]Gb[5chl2% zCv8v4^R(F#JHPJ& Ձ I-R 0ン}< E;mëd|ºU"9Ǭuڤ|,`7c1)&q%oNQb[V{z-fR=? O.ښCV,(VHh\$).3I"\Eн AY&R#Zg4!!L96teH9}gM5HԖzX},%_KKm">avzҌxC\INR|"oq:ޙnjc2y~<||_ xa&w}tY^Ж54ZZbh+j|}5Ƹ.ܠ7VXBJeS c%>/75͟`OPbs\gҢrN3IH11RADβNF"C{d/~rs28К@&?Ơef}U,i 1?YZӲB'ɀvj<\xd%\?Np)V_0>b,iJ~}^HU=h|vFRR_+ :SP>43IuI!!bZ!C&rKݎ/#sI:V3ɄlwRqkk= C4,erVE2Iche6Xh eHч8YFLD99ީKFY5rt6tq]+LǗqlxZ.:z\qTt'U[S9c.ICt9ySkYp̷;w=}LW%r#iQ|\bylN s1nPZ_Z7]t'wtlswʥvH-j̹un7M'=_5y-zr3?G -nv_6K̄T 0aϚy5VՇYiSR)d!ȼY 隄,jFMdCɨFkF5Y?b<X?vՈFԽF5+R9d)R"P2 'P* ԑ>ErH6F 4\&"=RG͍D#Y,޶HZzW5rֈCZ@zq\:$_g5.U/zz׋8ހJ1t=_1W)~|GH;W~lO5ޱw\ [+d ]IYǔ* 7\{N*7uX`҇F_K:5Ыی^WLbW@LW JvNI̓n?+ 38yvxU*M^<>y'g>G qD;u�q_jG%+ $-eI21,9kwrx>m)`@*AzvΛm;G={7i";=[B{nkz>6:>SX XA7LgSitWg]>AYzu h@8KR/ƣ%׳}A9_.Mb2ק?<H $m.Yƫơ ' >1P@֘QpôBkmcGEo{:H`vطYNGR,Oղ%-2eNI쨩jU.*fW.$rtSBDlVe$T%G2+A+8Fu.3[6# ޅ.hvD|nt؇Kee61xz ,ڨF\ͨU4r .gQk:G(KU:).{1Ǜm&{ˮA Sz{!wlsB;43h"gFw-jYw_ʏvZ>->,&Ġm<ղpP;m{{{{ mf$teFiRLNV. 
7!H33{Ym˷?~mFyxTneQ_WЊO)$N@W6Y) I8``h8t^R+QFߦ>3կL?NgwӞ42ssxeR-ta=򤒕@OO^_W~(.Ҟi:9B^{!X=M?5|@,P/1پU߲R4f9hKbc& Ԁav]Tw6Cu;|\]oBĻamt˯w+ +A* \7VBLZIo%VKO+)4rH+J 02(3m{O&SuS#sϕsF w^4,ӿ^s=R1cED8&Frh!92Зj:LL\=1x(Cd=BJq!rAZ"Jr*o3A ` EĐ Y.]k:cv;H]ޅ7 #: Yys_) Mp^RS>ŪJ)لRץu@j䍈46E 6>KV3dY5TRt1k$Bfr'D"#󁹨MR IC7[AAe{M642n>ݽ@Cum1@Wȭ!fO2]cZ0*Mx$іM 9~T)/lgD_$m~sܷ k3-C@a$BPJ$i`.;MV3 )0dĄf(Is4tIĎk}O˓nʱ?-vL.;~?:^Jd8`v3ByȘKJȒ$#\a(4Y4B厍0(WP hcEf*CA<;=KJ,ɇâ:]r=DEwЈdψ6 ,jg=/ͺG+C 9 &QZD%&{!Ŕ$Z@#-IO%3/ozzUϠyޛ1`ml ,o=/P)^^{TR&p&SE¤NϠSW9޼C @,|6: rIO`&'AhOmts@&wtZ][a݆=2{, :4'ږ , j("ݠIЅfQ-߹WIꔵ@f2ى.aI9X&p5t5~JA*wSk`3 ה4gG{NbZi=4ȄIIlgWi?*omA7 B3iuM >Rpqj> =}". 9h0fj'NzBn?^_yJ-ܑ¯H-6VH/JoyFᓗ&7S$[Z&mcCiWj_+@37n']LhiU+eJ톍D5Of_ܤT"Q7GW6+5v@tRvJx^ 5 ,(>$?5񾐚iO-MG^cO~|p̩#{E>[iCO־QԷ5}}}V䑟JCG~%ם[oq[͕=x zZ.]) ϷeIv-(]o 59ṛ{%pQ= )>([Q]K;Tz>$W+$_uU`V8_ŷ2}Ϟv1g3r;gF'P: .s`o2nㆶ&{P70{9@ㄷXŊMpђ(щFkOlwÖēy|O[&Q1=8OuwOnmFCCiv \k%$sdχ0h<8$Lp+& Y!N"UAi_g [?9:z1|bB@P8P 5)粔sAF#F)%Yҫ |,3_5Ol5IJ=ߥ~me?vq7\bP*./|0xBX/+4!Nk"5Y# 8bnr  U/UmF֛oWq]Coȱznh5Ihgo5H땐+ؿR2w_lL'53adUsK+ݯ *8@IX&S i됻ܪRO eF h Nt:U] L\ai-?[JY̒! (ƢI<r|dV]n5m搗;&^[(v˼y+7 o"zf' V'j~,nS-w̧|ѐ_Vb'tf6q.@hh t d.J2МEɚh|lY \k؍YEifX՜5)* d۔"༷ qAe&˞&x+զsdEn~ٸj-ܼؓﬢc~n]?TUzJ[YLn=@ۅ'=]4GIڪ N]sv8HPiߐI?iRF7S˺}%|d=(? 淦1r??g=WDu|>[?@8:29nAzoo=GzlW}/ׯCQC,nZn+\Zsl2ݤ VE#>.L4ɡ!N3L%R &jK3!| [#X `^3y DLTZL&{ʾ K^x8];~#]2i|Jx bL/ K*$ ɰ a\½U53V0=$ن2J][LNN$$FH%k7a,զsXPW(94;]Bd5>y멓݆fOw@viU :wN E Y16֫IAp@CݮEMTDh%$n)`@eb^D" Τs.MfUꥑSu!Tօ0³w.?dAxzmE_ hd2wicS1''!1{IiYe B19}_$ `{!EQR 0A4Fɶch9̂tkl? Rv58UkZ[ Z{@rhN6VYdG`_ ]ǃNUaVڸQq ́ !]KsMt}|ȨNƹt֨_4x1Me8h}ϐƭbVd`e.YfU"IL$t uEII6ke4 r3>t󤹑 C KZHXF6#njOzq\$_g5.9U/be^>ɠP RH3:Gccgr9b=).CI5sKчոT}h+C{>< &U !Q([?3Jh/]&fP NY7^5NCm@My1;'񤈹!qw ё>`'{FJ8L6/[زf"WpZ%@"wO}!YCFF΢􌆟//9{*w[z`π9PrJڅhCˌdX%_k1ɿ*j nW[ a?Nx^crinwQGA{]\BI\zrN;mX0eLbJXH)EՖ$xQ9$Yg\Gl5޳$4tJK"'rVr5:<]ǨkR0VH B" D1YAFcüݖÇ AJg49se &AqIh4';@bPO?'ͅw+YJ) 2'}'IKk6yU@N5wVK8(q\KS1* u+4oo'w1k D>Q.SHr\L%a)4 >A.F'{SNl+|JKv7 zqArà *:(C[>e˹ cj1V:t z1Άa|n^_IҕA) ȡ(C=X.6Q`.rP.J 3./6PAOsj\PRfH "C9h/W;[. 
@믉%g| ƬyIs~3ToOtwلuʹ4wEtKi2otΗӫu4;^MJi&O{*h}{=0]XEjj&단),|sM `_ģθ+; ~5b yͺ6=g;=9Ni{3 -ko;KUs}7sl k&EEⅨ#qye;B[)4tS/ |~q:^#@$)Lw 9VAAۨ\\`]8³VR_0V(j LIӕO~cGk7كwѳ[\9rߒ3БuǪgE(UDx_\j3:!=7 Tfe z8n}th.>3ӟ(֯AŰQl^ }(y-x|f4ި#Dk/Vnj\SUem Vk2M$RQXYqԏ h挰2QbҚ%_X Z(U`0ҹt`jNb R[yB G;ى<8tҔ_/ LK&MzZw뤙O~^Ϳ|u.|qo'by1y3Lj6_I^LY)^Wo~3)շ>*w&Zgג&C̦zjҬG|pire;瞅s:TfqC8CSt.W\e}s4WoGw>wR,8O viYlMNI8oa'x+u"Y7Yx"nҽ+Rn|~ֆy*UpYMgD/^&OBtLO?a] ZE\ w}E~^{3[as]% &pt޽R t&ukJ*;1Əщn> ]WQwL|>);oS1%D-ٯt8.f]$uccm,.!gok)P7N0L*Z+ccK֐Z܎eqSR[uRo}=v[O#,4bd)Ssȍr6$fS*hVoɺ.%iVKF5  m?hj@ -[ϙ*7.Fj\lWYFȿJ7HAKB< B؎!|%t9C~ }9+i8۾_Ԭ{92JagyiE,7gZ{ʑ_bqm>d1@c6l70=XLf0:$ίö"YLrr$u)nUT<M@d ek;b#.tlU3YJs>FȡsnQ{WshM;r)|2 ୍a1@ "Z!< goIn_MWf9}|[So0*?>OޔʕuZ4p6Mrk軫W?{7~> 7'n6 =XUDO7&0m)P7@EmQkYf]R=BağTc"iIC%0'F w^4,ӿ^ uc;t-كg}4i>c9Ђ%k$G}adbJe% c:Oxzu)%֝U+ oҝs=9X&—a`hHh"A'>ԌId ҴqKz fH)=! BcC@Fi -|#JbśVS}0?xŶCZO)CZ^O/dYY}LcC>lќ C?%bcc4Ahr X!TҪ=n340CrJz -rPw;.=P&G+<"Xxc4ɥ ?vh">|Q7P2d$Jfh Hd}%+#딷۷p:͌2i,x\ߵ1?bo7km<C@ur@y &t>6.VW$Ɲo8e}w:ȬGs$xPwIBa۪4{럯n> o`^. /@xDߐ]2+A̅He L{0{fS}3;sAsYZvs9Ca 979D+1[ %r#-yt٧@M c 9^oI8^LFffoy0m< zŅFU3B}I=҄{ ozV{3|1˳pzsزxڻhp2\__yFMeW/~ɿ]/~\˧x]3\ 05l4^7Ը&<L /8oB;=ُ\§ڳ/7%ARGc .hc97 &-9 KFp0ꉻSë{%\ó/YNo:q+<cIxy5N+ W6J$]\Bas2l! &/wU6kN|l\+w~dKOһKkduߕB,p!V"4¥l4:j@NSb@8Ntkwdzرxl{ 4L: &fJ(R`2%:6` cJB]F;M(ydD5&) e̊&,'Frf՝c"TUH&i}g <";ۥr.cIԂg(%mܙ8hK*OY246G}T=I3ޣGI ւS@+BĜcHIQKB٬dU=s,;iJĒd"e.Ad0cJI2gee=֝ի/߀::KP$th&dH뎌3BHbc ƚ0:%!jT78HiEoH( a':l%0E Kۿʨ8 cb#7Rl4ҁ.x@y~=:Е]@!q":i}$m^Z+]`x 7LjS|ִwz r{J:䜓sK"&$i^ft\< #K_]"Ji)o:})q"[BE&.|Q޿g~:ذxF]2i`lWd𶷒D~&ꂾ q;93N 'n2I~pu2xy9}?MpugM,afV뙽 !Ar9m^wG`lSKnl 75#66CfnRØF Y,+h8ZLyzด[VƱג4)IBCJÊϣq(U_Nz^.XV~XڛZ)~rԟrj)ƂKڒ/KQaWz4GQ?ylk5w Uc6pIIy՟~z]}?O^~ |^{KZ @eB VwoWM_vmZMS{%M9oovҮfJp[ 1_]ݷ0q5f? K]s?qe8ILZ$' xHQKBHew~co,4Wo(#%4<V%LYcHԚ^+x!oHdSe_税Vo>;ʠˁ5;4Gw pb_Cs6:#OUZ/,6mrEK`zS.:/?H{ 9%dvQ&! 
rJXV<;\ɠ *"0Ƀ U&/ =rRyo9b&s#w;+Q\uUOr$/-ND3FjSqD]Z;ǐ% YEƩA5r;ͼPvךU~`_ðrmeI[hy]i)KQb*i` Ӳ8e/^^]LDH^hbB d+1x@%ڐ>=}b86__ $ͧe15b(H&s`{/[zmj} {-Mg1eGZ'3?RͲF,7nh;(,sr% I!~?`؂VsNWmDsC,n/wjyfs}%7,l.[wp)7np_Mؕ!ɍ"!dX 8[!vήMF \]u 7Ɑ n_J7wWO9ltYfm;gJ6TK5_|]x{;|0oXwy2Ȼ[*'YRzQ߱Ew@< 2 vs\y 'g"mOi#'G\O7hYI#>\!'L4!yy6ota pi] !vD)Q4AGpS6΃͆{ͬ倧LYX1.Gn}>D$lMϺŏ=IkW+h@`zig8x]R!AiHe #="魪 %4 ېTFI?+^~ɉ3$"pU0 Zwvn9,Mո1&'{J~bY |V ԪZg+R!H^4:Y;h KB8F&4d4~HꁦH1fOޏr`̒ R021[/KsgR\KǹHնպ2*la5Be[-<-3,x $L͓?+и`iПLqN˝99n3ً`'8|HFZ-)(Ɂ!(34gEBb66A`h&of3sH-vl~47\cդc_+[mY)!N|I!|uA"3>*#|fì)4wy2/A&C&&H&ʭg|{+SZ;a/rx4|kMe;Y;K"7VYd)baI2 'P2m) ']-rH7N d 4\F";H'͍LwI\r^C )E.ݯ"eeYmAbY]U* 4ykw:#qxs`-Ų\:$_gg\^TE^/nxɸD6f QjzƎ$t1FQ"z9θcW};ևz7} */M9n _)Q+qяVs_MoOq&:EF Q*fK'w|xǝ"x>q%Z@mJPW=,d U cJ#68R$#Z{1M>c:+rja6IZ w/|,zMP\Π`+zvzls+#φ`(ǣЀ:6}T;ss-Ǹ'qJ15 P9s˭#鸈XTAI.ӑ~("*ͫfV~t~<[L_` g={!m<. IUp;oB!1qV{tHba)IIohOMu uy2l>SWZކ,e7R=A~&}>?Oh9Mюrxw3\kK(eXNۡXT>_ek!3=WwsPp-Xzu6gG ^=mozRgHmHjrv:1׵E )qX`z(ʙ4<\ ^q{'>\|@1=\-"xO#X"! !Fs*@9ˍÁ`nl/ s0E;u0 Qd<)VxSjʈhA #gh#2&"ݱ3r_3!( ;d:x8j;Zu9żRɗt*g/  :-0qZL@SSE$6 pRS vh5Zk(3*h2:ڐr;gB[g<.kq!vRV%n싕>H6<4ݮ-.-KeI) G:`Ng)^3ElU%70m"GmV9HӤ}FRJACX_ g^}ɹ`1X>NFqqiBg}``T` q# #j3dmQUt6O]K%(0U#6JvK?졳љ͝Tcnڡѐ Hz{$Q;۳ZJ]HpmsF ًO.y埽&n/o+YGH(򹖜N)0D&Yj/.@.VZ(.#+ǜ^gw1jBbewhVy0LqhN~KW%,Tk<ʖ!gG΁5VŸ[Ba J:Bry3:WmLYs}|/t< PKXW8!x<ɾǏivR:Г=wp=:=fG9vI| xx"KDxFi1AkY'4Wj/F2%{LZ;)wGYhQx>;;aBw'EǣQ}~GQ}pOt?q({GQ}~GQ}[zB#>j>j>j޲ٲ-ņteYg5$|GFDc3Un-LZ)a[[j=pJF F{X^ƀ4LKJBPVJ'ݖy +&x9OZGEdQrt9lUEb$Ih$]튜DZ'h|ECܙl6y}i)ʹ/?nbZ/ ln݃?)ؤ6+one1pDXo@WÀ}/Ic Dw.Btz'-}@h DvPIFy޸7ZƟnoZ}".hYҿBZuNٸ8~T4QHƃ ?O#+Tt|}R0||2,xzI4۷UHEw10*eD߽bhra=@s t.UAM\4%a=|:i^"8|A+Snc#p<`EÅ)hB5 BkAs+8OYpO4M=ͨ;A[s@CiIAe/ۼHi{MJ=٢{غ RyzV4v꼕%|S . Fv׭][5f/~kܛZcNaϛBk-iPk| BǓtgyñ<]ݐ~KRޖpIWpV>3Mi&t$rj._ͺwhVF03anE Tz 턷1r&HnxQc/x2tX2]x zoȮސz暭4O6o7laESV~ 7'xͼl Ԛ…*ٿSŸϧ0x5*όj8Wԩ1dt|sQ1X? 
JJ;Aj"xuf lpUiG"jѣ|}Շ Zdl(;&!48SҎn ,A~$\#rAKE9;P=z=l>օ>&J1ъ u?nKVRsOz\t+)YW-]bʪI~1)IBRŤK_MhT:qsTIX0:VGI&`hNu*@POM:B,~Pn`JِVL/YO;֎_2O:KOBg/= x9a~=ZtєHz+0 Yc M$ %CNT,XnnrE)9&y$ViǬQFY &Ā78ze戇 `N PR`x#q `g`CT4@`>JCЊ;,"Ec1`ӂ/q(|7*$n8Vqـ q`UD#dCr+o"@Q:,*w1r/c$cYePyHhIxe3I*eLE M{{*k rY^JzvӽW:H+%wu5FU=gs/qRS">0)QHK/`?4 :ݽrazBGzl25!ixƘ5y^ww^^hL ґ&tLz8!]K BY4IxrJ ڸ<\Ϧ=<k+t;T䚟fg*lQ!BT6S^SIw@ T,xJQ4ѦP;Vκs_9Tj+ U}t!D؊6,XK Η^YCYzrZP3qIژJ} RqRɄ%U!ÿN?7l7t(:!!\=NKu鉹g"91z)iO<8[=j*g @f>=h:1"5h")O9wIv{gZR^Noayy|V],RF+09ᲐM-(- J:ElY$g a|DGz:m7-k۬~GC&uO/x0ru]-8 ̓s|}۫_?'_[EFr(K[\Ōlm/JL/OLaZŋ6,,=z&HHhj&r<Y5BЀƅ&1Eޢy,xXwYB_4uUZ_/n~< @NTߐ\.:YAaʡ'%%Q@{4sfC}?Y\R6]?R|Rp3խA؅'G!h@/m> )DSA(Xb3ұhh_BNtD~)u!Ǜg..A.r|&gA,M7aɄA:7an$99qn,dy^ƿ¬cUy%kpͱ JFqboI|xZEuHs7s!EI4DIeX[vD8|@px+ CTnuQWSDuy.888oANUrD29e.ZPinss#48Og)kxr<8ИK^11*JI1gcS ԎqN̋⵭j$A«E*@P-!wc=;;YW}j 6Aғ9 ȐEa"dapXyjZbFe۫@*{D<Cu%}6R_l,A+i#!M,t&CΠVz;^H{/Z^ W$IB_&C= ` &ϑ'n&YJdb?ߍwJUIdj H%:ȯ=A^bT,;i{mᾝOsSEE:}Ow|d`+.v:De!R1%H B1 cWZbư|,^=S5k`+ W;cH8 Ǜ+"M[Y 6r3Դ@zggr6q!ݫ?J -$)kS7 rj҃N>r;v=ґ*PncQIzH%]H sGaW/[¨aVӔS!u^Ϧ}C B''[^j܋(1,>!D ZJ-r)]BA}=xdsߜFy)Ư:lD)*6y6DBx(ȲQ#ѹ"4qO}vʺvJ ,#Pg)'# S bL/"}CK^ϋc!+T>)Hx@VUtZ`%rJ(3{HT}8?cmZ0ty"Qk `㾲~j[Y=f3秌ّkͰLM$?_s #/p'|8J AY~p>g*pɗ@KfKYOS-Tavc7} ߒ +-RXS۽?}=ei77VgƋo$6*-#R ut&N,㑟i]< kV? E,J,fg06UBnaɇ*Gyp)Iz446lX|8Zg2L~Ƿ}>2ϱ&:g]˃::M]&UӚ`Iu]#oi7Eavijb ضc݅} oHwfn'.NKҸ M8B*ᄌ9V$,E>RBo'=5ᰚ\ʧ]_as JٚtŲ𳑉ȯ`LltsPw΋wӹag]wuFa1wyLIfрi&(GM"٥`8L>R;Pܑ#D0 O:k(:'{;WѐW\8a0;ňlAF9:CpZ)9[.Il~;ƈu="_k A;"& %nH">JD$$+F'Q7m!V/bKp\ޝ1 k /ϓ!/:usi6lW x|Qℏ~^?R.5X Z i m7^p(KRGk-,J^=xvG2wVu4 d%^mM͗﮼<ߡ+%odrK}sϼ9/nei6'ea-_u{cKylyʘ_1;ټwgL31m+ȴ~.gx֨O[退SEea쉭<**4e$zpRJ!$CQCp1$QJ[7Σ+V@mRE DJPP)E]OAno)ӳ[__v|7EF!H fЖSVBⳉ"h:cTJoAX+Dրb64nd-eVD 1;VκC얓i)<;EgruTpu}@vGݟ|Q :HOBkz ZkJyA %5( d@RlI~ERRHM|NT xJf{HWiE+ hHhv@,lTgbp~}Gc!vWիw^; ƜJ*f6՚1Vɦ qƺeօl o]%dm}zFb7v{>C. 
-q`sAii-ebyRE#2$gx6N`b^e) i]IYcgjm=?bjmYk#إ"!M5uE# A03[0>7Q YaB2w3adJT;$$ǐU:H.kևSMqg4b.XW#Q5^#.͗TXHȐDSȧ1$PĀN[ ĴfA8 @!j8pFS BĤgNA`Irԙ^OnAe x ~Q#Lя))Wjo)h[Ziyɬ%x޺LetKۉw\+bn️wCQPqpӖHMy^K!~cn#[Ƣc\g8ٟ6-ݟPGGlJ5$SDi/[ mK-9+@yÃ&r%j\.1g6͘k ׂ7;%g /YUXcЖb{,Ol(SSVH >Q~U--k:/}X⃣k,# ]F3UC%*d{Fn3PT^ YcղK3Ha^}H\ar0(?e߁wmVob<"R$YgN]ʲaK^nr([6x-Bhr U) z/!ͩ5Q߇B|WϜ54Zw* t n⺇(Dk柄pJ`NB&\K?Ȩ08A[ppGa( <Ąo *VKn '|xbmGkϦ* n 8L>C Lfܧ ]oZ[^̴2t5$ynReMV|;L|d0g0xߟ1`XFVXu T ?t./q@LZ>.&Q sЧ}37?SUy~MC:G4&;gG;]xzlr7#$~sljw˭Mv#Mv lgPBTpj) ׶hff0L+LGӶY6Kk@eOSLҲ]*K,J&T5KL1wapDaM̫_,pj=ԳbqR VKA>5&lb&O6b AP 18˜ siN:VK'R6b㔐&Iˏ$`M{s)h_ (^sq;t2ݓ'OOSOO=RTZK !u8g`|Gꈏ2)p, Z $ Q2܄9fhYoL*]w="9NPy{pWOyq20$?S RCʦUƹ)}l%>s"zF^=XG.kJ$g=I1mЇOħNҞ=1!JJ,YJPU -jqI9]N J |gל \{ k);Ԉ6zS ( o3 鼌I'te{HZZJJpp^t`PҠmPV#A&KY_$dh0' Ft"|AE4ю+8䙯i47bIt]) AmJt9E&u1Hc)R{&A.I"{9; g*aQ5ӌ-) *jEf3P"+D?xh'k.#+TT!H% M3Y鴔Jb֜-aT5dhtZc&T0/#I"{Q,\t "S>+zr@J-uQ1K6,Vs%XsjeZnI{?i"LLD0+a+/-\8+ ZWy0.wxr6Xx2,xyANb)cCVX vl0DFnT1 $Q> ˽;] XFLl,tm#8"⼠zKTb\@\:C(e$^{'0S=qȺ2KAɰ:]ܒn)[ʦ3㟼.]XѴi/a)Z SR&MXʒ f @a+"/ KM4P}Jm*Z3RU]xlVħ RX55I΂QRh1G2(WU~  ͕ijzL\nO"CWM'- ܁t,F"rx$ v#P(YD( H %T*̝.At5N4GWׇ݄: xcI}ťTɘJ@GouᏧDTRQwX[y <}H+`B 30Ɗwx ~~<.<?8gw'ÁS*_ϝ^vkR= 0ZUR佳sӛzBF$. CafY~uduCtEgdq҅|fF]V)ٔYh$$ ́}TWÐ O#|~ݭlCU]z8%zoRxqq? 
]>n^TgzIfj]/|h\U:1OYid8MN\9}uɳWOO_bNO~>9}1`_Fu]S\l<7Lпh;BkZ|8[ 9.* z0`k*O+LqX#_'lyyo$::vODUZ DA-h8]lJ13irC\v0㤉iWopI JZF$(Y&*2@u٩ljz}y6w1 !ezM+ miDK%t%#&_D1D6.kӲHܛ;XT{UO@/|(ݤvѿj6숬f#_oo+7)M ==2×o~#.g 'lu^\8)g4rWn[jOn6\=s թ}JZ7#MfeH6zԻһ]>>۹Т{w2{_ȅZTBNN?~3/ hn`߶=mi>t(0ѵA=WpA3mrHVY-fN3Ld;uc;x; f:魔GpS6΃͆{ͬ%O!g,F5 &;:pm7T S;~x׃$"i| bL/ ^gTH2HaHE #sH{ jX(\d(SL'b< ګa* j5rr6/7'XNaQOxJ#ʻc+R!$I/vhKh K!]mdS HH UFc̞h021[ѥ9KgR\ ǹ.[3*ta58TʺPYuݜ낌.t@ww]:'ѡuIwkluXTɉx(gK!ANp.I")-k$PLnB"KbLxMIF6LCAeV2 +kj~6&hjq֕`r|ȄS:8 YdG<0/ ]SU}6%͝,d3/)HtML,hHApL3s dT'jeFn}|SQGkǡTֈzԈFz^)i,a ,1J-%5ʤ']-rH6NJp\> J'WIVE9R/A묤!m<ϣ <ۏrRn845 &}i9]6ҩ^v(+U%g/z˄dij$)C{9Mv52u#!WbK#roA!͛R`dqYPGdY}s71dɜG,8Faٺ@̈́a.\} HJ``2e䲈%IH oLts@9{ܙ{Gy>xwLVsä?5d> [77GՓZsns09j  W6I{S6g#XV&pZi*Wbnd&eyfDЇHhdldKYPBP$w* q5rv u-/Q7 E:60e 3?P $޴DD}h&~vh*n^*i BjUF aJR:D2\j}Bz-DSA@u_d3`,L sb3"NR|*' R7~1q@<0#Zfgᓑ@6jQ CF""MiZH!UJl6fkKqy"E1E'l{w-#:z%tB l舖nvVhyj\Z"< m}V%d9&NPIf[m3$Qҗp?p!k5_e cj=}9 րh.ii )i?R:˓6ʘHsJfQZƵ$q}nمy*wmyz1S-vY3oiM >iߦIu}sd/_@qHS{2k<(r@ bk x ҽ>.~,:Ak/ۚ7|j VӶXGJPj+ IR{أˤ7:Ok?Mhzkqa62|xc8hƞ) ZI=:pV r'y-*"E%3Cy C8}HȪo\'7K">}aw;VƧf[/]__k#߰z|2́Vs2\bb >`!P<* mU ;\[6B:LVfJ='7@#K$T9ډ[[Y$V-SPuW+_cmX)ZѲ|pt͠ȴӁpr2athUEnlK(B%]8jk]h.}X0]2xw6s8,'oJ27|$XJz&1m4G%)5 T$4YHlfuQ;91%Dx VQDȩQP0b&ELz̖ʼn(RB*%3:Di@FiPj+-j?5EM_ջ}Ӧ״ L.SY64onߗ4?]״y妳ſtUsÙPf}W_kKgMweizӗ:fc7?y4֛7)yӼy>Ь/gp-J6(,Y$fsB> ^:s4a2|>]G;Hw)Ȼt!4ϋ8'`9/)1 [mf֜bKx}r Dez#>ۓ~P -n=nR\毳9A)ճ9=c>ozo\zPԥ3\Hq41z= 3׾ve5= yXu{jD`dzD=9 dmT+ dc\3IQ:q].IwEvI:me6$W`I: sȊ4X&e֯RBpnkFALmO?;’}v~yf۷{IS!cou|"vEY?)\OJQ'1czRE'KRop=sTMܐa:VX=Rd ?! 
gՒ 2<զb)pKK ?]}wrM\VM/<=;.} aӘÅghb/}R:G>}-5jKz}|V3e-9 c14pvh`s@|QXɓ1U/>).Zb9@ 6HСdHNx3s,NiX=wA>U8GE},ߡ dx3<܁\hiI%|4 GO(XvҢE7P>9pZ%WD61 zvs ̈́%s,2vD26'ۚ eF~ڧ<7ޖqwvyݽn0?j'gLG[´ifBHf^=]~W/IA#aEǕW:M`eQ\rhC wlk3Zx ΖF!sT23 g&YCVN+:;ɘemoHZ&)Y^O1GYi1sr#FĠ>, \9V.,Rk (TEIRɊV $rB{Lpj2id:Á/Km`@AXf˙L jΒ ^N,uNdJ_WZd}Zt,1AJ6vݽ8s䘓ów6)nܰ jRZj)eFC6CV [kl"* u{/} {*<א2)izKϾ:CAĔʥ7KsE(^JqB Ke|Lk:LzY`{n1d00kFnc- d|vęWˡoI9|j߳O/#y%cmH <@k)+- }Uˢ"vg"eɖd٦la6.gf3;; kQ9+D6G0I'.*7c3U`n}:)5uPKإj{^E,@OQ0XcCHtiIIT:d:P8P`+(4tA򰴿Pن!e?4ޒ|{b֗ėG1 bjB~27䯑hlbbt3ct,Ѧ,J9YWZ&;Iy%}Js>f:Z$sH1%s@5X[1ED#jF.8>v3KtբO _71%gզ^']o}? [s1<9BvQD;F2P`SJB;xfN1t ϔ%Y H,â[Sl-h$)y=yYTM>\`Xha,X&xSL(h )PA uS<-)b2=Yz:4REѥZ4Xmq0 ǼVqJ *1cv͓q+ 8Ysi~B,N^ᆃ0`0$LʃhU= T$I#—#@ } 8 B*@W9Mi&Pz0>4/> B-o7` z?lqsg!DX# |"0TpO4 'řCCoLy *[Ho3ЬD/rrkd5b'H}~IO4mZ^Z/~Zp %U0Z}{mG۽oKJ|]Lqok]s K]ozhP֖ùѳ]'[3wZ'UXNUY/'o|t86 OЦx5l,{{æ$ٍ0<]o~?37~ĩX:<9";Mym?Oi_}[/?<<1K b0i+^1F~^f]|tNGo~OÅM% l͏ۓobbE t vtx0~w6M~ 'a )nҤd08\'V0K>]ͥ3}ey͞Mt8O|_tp!wW dW馏Ę~~LWwsW~_ո$:u凟]b='+&Cav_ I Erq*&,U+u{,1L_\1LNG]볺Z]gfmubeQ2)LhiWkQathjD(%?[yr럫I iڃOtjB{f>MOgRw\71In4`]3Au:6~;{Lꫜ*.wh.6 .n;<Љ6\ݖ]kT {3kulT *%F9Wx%8bƁh1G 䪦 w[{*_(kK-Ĥ}gSSf#4Ւ43*K˔j^yB#GLy)V^@y)Y[;e_>},wiz‰;JUk)C555i(IQN)93ZDc< dSO@Օ!X`3uU ;i8l9Ϋ,.z$EhgHkMd[[! z;WHCsKFDoAs߁-PE C%V@T01&B `e!#Q5O9̦:qyFkō:qRYZtGS2WΗ{%.V] >ZYK0X`"k"QK`,JQmQ'}Q >^I8;9>oPzYk v&L& ҄=|n ~u2[}`-9颟S/q26C܃T* 9WTv3ix`,(3x[V0P'K18kLrE AGzcڏ1$"[H|r8qF %@LE9:+Av:JK+N[=-;pFHHCќ "crp )`84xNF:cWPxA=Rߍ gՓot5U3s>7ӣ,6)w}SӇ;5  :-0qZL@SSE$6 ؄T%ѻ0'0a2|ב/җ\ JaonY~\) V^'8! Kz %C65> 2c$of3x PTfE/R<罪@HQPkF.$%USL"'tUt0Sԩ19at|s/D%OCag}o5* 'yh 0ռT#ߴSʐa-N E<"g2E]u Ck]XN4B"uG&\+F `Xi}LbEsuF`^ )Kk1p+)Y2V=w.G | Iq'e|D^3.j _`u4C4fO0{@sCJ*(&>6Ig`.J6$ZU`ee+kvZccR"蓬Yvn; {Ɓ5vS9f!OP.yHzHz+0 Yc M$ %CNT,XvQ#ap)i'*.5F8Pub {'l(F1@1he- \[Ũ;B![*rt( 80 I BeU U> ~a ^*ijO/TÞG\D!zQ$  \4":4B+gϞZPQi5j| -3<)[|YLy 1UWhu\~?ɛ(gRoo_0W }>I&ߴ~f= *o+/@EPx2zי>۫?#~nxve7D"˥8P-0+pgf3fs4<_я&T[7/_O|2#nr(#~n]wrV W| -^wluO[W=U,~A?Ք yY7.3:cn`:1]n? 
;_f]RrDP^Yo0:w8믺י<985n}JR=OzB\&`i\$U&kA4+^w3>Xȏ^2KVͩ=eX^R^J-NHuuI@)|{:nϯ zm/zH)-|S'n:ib?8ޠ^$MTר{`N M ͺ{\l'2@8x& L+N I> tmv;1 1aEvt͆ɀu;eFsnLD ?M5ʙ n^xƍ-2QR)sy#^ 4@ z45hZд'Yz~O6 2|3w3,$a 帶BcItP%oϛaɉDp+Ih-:.K4pN"!k* ";AoAvGt4u5&ME+S/fm7+n\]w̠& 6^Շ~KX4ZZWZWqleqU JI&f?9Dܗgy(^I8;9?oP֠ fmLØ0 îW$D!QfcBU USXѕc<AyqG]Džř R{jGZەݍ֘;m%yNVrKAD `:H" zk,<W^i **}Qo4QNyF:DvyT9Rmaot1rUY+zMv2 ΨNqmALh-^|e[s` bьY>_z\Q^gmFy-J 'cbF=ۜ`;xϑ[ )D7[Lr.>J,"dsiP~En8ȘSzBJ(TpZL$fm$YP 4"#Gr %L^j.gIhNt1*-@rV-嬖M4n`ԜYdi0VH\"2uAcd4V 8̻)08`Q5"PR>>(-_DHF(QP"b}܁VXFU4:W9JSjD ZŠu;f{-Mp) lxEoRi &ZPGs*wY-s8(:^'v::<>A~2M&dDC>Q)$ii$]L%a)4 >FyDAr(_1UMLl#4; ZqAd N"9D(oghЈG9񘁱7G#8dWNa^XNQTuYʣ2.]R2 HhMl4UQ"}Ƃp@)EY@n Ke&D-+cPDar\q9^/IqU|V)e& )˰B0ŽJ%xhF+UP"b6ܣ%@qy sf(d܋b?Z [miɂ1DkIRS:A(߶fZZ@,A4t}폵4m(rZFp\0$ehi yAF1RYA'ĥ?ylE ~J' bopAF>?&Ė^?[>ښДo !yrGjuad`+uGv[kSD%HFY}g0w0R{]J`]O852y@8]~j#{~F@vF NB;Op\d8B5xu;\ՒdEGь"HtBMJr!D 78!m^^g>Q]yӳhl}TMoۃY™vi_NGW*fAϜ:M#pq$?q%$oT7\o-<Ѥuk^Z|/s5$!L¯"pgiۡ5Pjho:4hК|Ʉ?.-k~i\IqUUF (TS.mE,Yvkb3Fp; NRa<IEѡ# ib$kzg0U.Gxdp䡯 8Z9w:|9qBG#SJ|q&NshuUk1x&<0;+`Us(Ju9SᜍJ4V@#ȩ-e"cFWA-.MjG4+v4Hc Gڑu Y+$ 7+ KJk^+knH'{GM}0B1;֊ ~YBQ' 8h1}A' $ 6B!*3<W`'8!DVh #Xa*Caj3j2b=6MP"%fƈ-XZ)F;f8,1H`D>r0sFQmܚoK>[~wþS޻2_iytՇC'\*{a .Ǯ=0)4Nɀ{a#H#Z(tkEr)N*.~,(*~e_w3vTfY3`*Ks>a{O*g3f~ {J7w;'pa&¸C4@V{ U߁~tu ~KEE7ˑj?6% xO1$U?d TtU8 dk'u Lēєc{t[sch. )o4˽}nWuV/iDp |$IwI.e%v*ké钶Y!r[ŕIfv m؇\Lۃ'>FWϟFr5NǬV_Մ0EíOjy<~%as#Wx4Z1_{w/~̛<(\3^tM;JSSx0eۡXQ|qSfzTJ>fs0 AH8ʬfYJ%NG@T'H, BRAwpM%*͓;~B,'3$#JrFbfE4d:ĽY50$5IXD|0a ƂHB9UaaR"2 j6r6r>N7XwNaOvXD%wn|nG=+V^[I9F!@ғD%QSh "Q{D%#)iz*JF8S\-&x#32x5sf"(ܚ195c>Mf]u!ˬ Y L2^R-dP._z@aѧA|[c0qRn Ije^ւ:P(&:!U42XdOP& LK`!#a@хT5v6rkl;9Pv6UkZwZC+9E)B2k("hGZ1`gI9~alXo@;ȅLe8f`!7T A ('8"AuY^VF}ʑ}H`4b.U#Qtӈ+%0$2$QE1r 1yg1YEZa@6TPb%#1SA`Ir,i$M6š;w,95⧋[mA/m!:qɮzQe֋Ӌ^\qʼnD!X l@20[ IJEU0}CчٸcW}3C>܃ }0UAp]3E? 
I0X%E]j|{Z3).2;|DVZ^0h.R=/ƒ.q?;Eu]Bx F)Bm?N["5 8 =X[,Q2HD|f *(rʈ-荶"% 3BsR@"g=n8Jܩ3w{uOr:hL_.I'zի"P3Uj1D)H B2HbW?Xrц`U:_̮H QAddaYȊ0*+}αT9@w 4B8%WFmX:Lk6'XACQ@K2Y+냉f$тFQn:1@69&<}=$@+v{uO5O ]hagu =X 00.U2=yc1TIfKx*XHdcOL}HJ8%26ũ-K!( G2 q6r u5ETs9L1T7K I o6~S.5/ -c1s)`ug0CfGSUz_~W^1W/Uu&I%XAoK g͝M!d{W/|fr=mɿ׽ļrֵ S;}%fNz?*pO$Lw^ؒ\b҅V;ʠŲ[̲`jFS?ŧ\OsM2p `BK 1ngv1;~ngPBTp*) ׶hݷgڸAYgԩ̥|K^-yq|tgԆDZiPӃF=Vb y\Uth8iyvF@e'tGdesj%D!?A-Ě 2:3vf5aCGXٱw{g۩:Y=F |B֌dO ò]xٓ!Q > \YXϤRSӊ)'Z6)* e (fa !a%%Q!@)$oT1G EέRȰ, -J)$#$1wp6r .4E /.?Eh?W^/>'(l=9]]]b22O=R01J:B  BY MYsM_l҃dy)"+{:i8~'bBkjMe C2ڟLGuNRC9I -Ƙ2i7 )$JvYȪC;aicзJǫI;nuɘruSt_KF4p 0wP]T1ў:S0P`SJ"'^wa/!rƝb3(%A)# _89Kyy0,UQ\(4J2$:$M%wh4>$iKoJ@bFigtX=MS` Eph*<19.XJM,2DAAKPL* 0ӝɤxFɅTRi/MH.U)%uti4V[<'1oU JU`|RRϬ9^Yp 1"jo('I)>e1pDXo@WÀ}'٤~X?$".B%2)ZGV(e׷ i 2^ʡē7O72U]|j̶ tO*fl&qo?49ۏ3q$6`ZD 4!}x4`񉻚s죹".Uyjd4>?}{}蓣O3M?f=ayjRͭo_Mr?^Mnq{116|=|{wٿzC~ )T+vJ-\By \si҈wՉR^4cj<^?ܥ?~0% 0I^gbL:E&8'X-'g9¾{WU9TLZ))yv%#Jxun|?? "4^gG<3d>Ӝ&SE}eCU#dYԬ+={nO G5c5ߔN6E ]Etޏ,M1^56O7@1,>NY 7?W5 5r4 ,L b$4| zaذ\3=iWͼJ=hwۙv5ֳV`LxPl~Q00+uL4_XM5u͹-94jW?_RJ"y+jaϗޜw|sq9|?hE._^P7Mk烖l\q?g7g?>K=8~ne8I}ysV>^-X',Y% WG/Y=]rƇiݟ 6/N^WoϹfl+"\DHbU4>^o+p x]0͇:!6=)daAF#pX҄=ssFެW//ݛa;%jؼtK_NnovpllM[.Dpmw4;ʺCv&wx*{퉎Hn’6'HTƝv7O@oOڛvk\R48; ttԓn7ZI,[DN;nu/.CG`ɳAÿ?PƵU4Nʗ^_2wmTH(m үQ:[<ʹTk g`7.u34&` lz%@RKGr"Q coOcܕ$>A!4tPh)0~\Vfrd+}֠ QDČÂ'o G[?d&=yjk흴B4 32ID`JP`Նywf( h\[ƨ*:X19r kC>x{M Cy/Γ\`D]A<  xfC^zj;c`*WthER[M'RP<o)tjT!dbbw.ZM&|hxK&1tL]9{iYc%G_,1bLYZ;=7^kMA\8@^[Ū-1bS9~7,4S\z0d s;bxsv'wEwә⊁\ϑ`ߏ6˩#Qed=yL^ﺏ}L6 1P}o{n%{ k/˳- +Pu{sHum?̵Mg]Jl=mʉ"y}v#+ե輹]XsH&s:֔Fj.DrV)mz.eNl56iu{cvo֦imh[b:VKkAg$mmQSz3'Ik,dH:rټ}+?NRΕ НZzt &kGHʵ`krSkܝ?9&Tnf`Oh^3նآZLdf9$xy Lǣ5&k>w,ՉG5)eUACzQ \&єY,2ڐQ[](*5 :݀-B X q9wՁdrDj< _+$gT0BsR f>Fie":8(9eTY2>2@bۃB6j/`*Zg8 OKa=b_uPO##EZPEIk!䄄K}2ya r u Q/#B&!j+ Y $n 0o9@CVHHWJV =GMƺeg5:/fF>@$5ZT^SelƶJ}4o hl5j&@TQx nGe6᭻ "YԏoT?/Ѵ_E9լ p6YKeX8Tx&z۟ASPP@ )ǎaz:x  ^bq% ;e3:Fp5iuÑyF4 IDe#kpq%cE1pY [ '0qVc]}p4,[JH'JdZ " M;2B:kˢ9`VQDjK,[wjTjk];z!$&O h,߬tqŊ Hy5!z[a!W[uR`c 
#GF٪W4l%kǜ녙#P%d0uugF6#`tVTZ pWYaw ~)7,zOVI^5JV֮ n0 <5/{%o=+HM:%9UNrHV]~@rCSAAv0XgRF6 )w0ց[ ;ƾ8d nV"⵭MsJk \D"Cɟu7/V0 .-g0FhQH0lnցG In\:h\SG4mT6ZcR-R6 "7+;dҁcZ RӪ $_3T)᪥{'ktzҨA^Ȑw7WC5^5΄Vup$k6VP4kժH˖*3^tCd#Bj?H(AOr|d8NҳrOV I9 LR=i ҭ5<H`*M\\9Bݎå10r,gef hҙ\ .#!ө . hZPkBwMX>uE> #W0@5Kd!}s7o6vxQeZTV)ʹ&kr reަ@H&D: z]/(I/H_u>>y?X͢-Ȼut{Aөf9+ծreTHaPEjϾE\)vI]Fj6{We̘~t"rŏt ttZ27 &\K+kRVQZpv^9&f y"%zX/8u&#Nz2ST{pCzn᳏)d,F,j#k/98[`@I=ie[>VK3;8/.i&"+8(j<l]#8{ԟ֎Z;4f|BJ82|B4hvBk1䋐D eܒ)!`@)uK &\vt Op&͞䅫kK^B0o-;ʜQ(]^jv֚k6ς5,-Wz 30lVg $|帹aj_1Xf6Bȡ҆5sF0y;gxʜFrN(R璥>%Hc .(Mv o <[g6.`P{_h 3S#-y.Zeq@jrjI=so|ڋ2U6& 46 *-p7ښlEbNIq VU8Q|< yJjypBfixNt1^C(p{12U=S]Yڌus]%'F]!R%\͌9idMDҍȉ6x#qܖel秜|'WJv,ZSՊ/P)(E^0T$ :"D:۠.SH҇P5AVgwY|KorPR˄A"hQlb#\yhF7:.8K/ _ĺoʎLˁAՂ'X%eegekSA Q[h+c2Mj_= VU,<&N{oG8 RfL9ǐŭafɫY sE;&Ξv:ئp)liDBAdB$ьiT: aX/6*Yh5MsG~fW4{-lSj+L>D?L وBcT'kx Za:< L*wX]q40*!K╂͑+gQ.nm4A ^P`(?::c7׏t+i|yOcQ\ :Oݛԯi~'_C>4 $"P$B.1TDO,ӦhT%K5Z[v3NMY=9㫦"l||n|-@Y\ThbJ2{ -ja` 9]hc5at}{2QKCQ:ėX574)e$}R˥R@lEYlH&ϴD(IX&,l@ulMiX<|G $ё El@p[eְBnю D]*žm3De 7Th=Q3*%Ξ(7=+bGe L:T(>xiHA)$b_ggŝ+%M$l Z0(J,[-$T{l " Ӭc 0늱gc o,d$L[vP$PXbbzB"X3Սȵ12&xa$kKBz5&磤|]?cfv~.ݴ|0Ŕ՛{% j?|,Z1 )Z!k8 %G,[sh p8߶͎Zܰ_0 n|DtWڷ]J~wpct ۮ{ů.IvdGs| TȾBnm2mSڕfem؇!mEnNv ꎁV`~﮷,;,UsqS]lflv#r6Wl^ aL6l+m.΂O7h'E8i{{Pc n3u@٘|Ĥ!XB pZj!DN \x+Dd. 
fM"֕V抦tdiOH"+Ƙ1beR#rg5~y9cvS$^2/E0Z0PLϵ1*$CAp 6C &qRުіTd̆br,Pƃh%7FPeZ*j5qnCXNâb=*yx'P6W6jV+[ɥfҋE'!gY9c@sg \0'z+RFG:L̜20L0Rl=.Ep:EEwJ28ڞ8=cX/BQ Gk9ӛwYNt0~7~'c+gSR'cNIC: = Q璀+ӲV* Ɂ!g${.2½gɕ &5Qg툱 ̪R&A8= Jy*^vUe-kcUC8EiKc Y%XVbHRzF ~r",.4w ȐʴPT4HP,1NIuҮ6/k5qv6_-d-(vf)5HS'Iw 1jq?=U"?m޵7mc?-m&ngIiwgQbK$;I;$Z$tI ~BޏJ0&}5tycM:=;\|f2)MAiHdw<(ny;۾'ypFϘnK7/i Q +j=BP(!} 3L\/s݊|ҔX=0<LjoTj=H<=8Uyb HHCќ "e6bO07Y/ q0ҹuW#!:!UJssU)5eDhA #PC"eL{N5z2M6J}|=ޖX';y#޸iKmL%c~}siy%2DcA0Z`P&(qfVYLɥS$XY :?76V YgmU:;yo) zEYS$55KrDz|3^uA|9NfV,y D)6G~G &80aˤّ a)GF&74# 9EmӲ' Ƙelr}D \YibsJt9mUbdiS3_'kTi!n'2qi?Wyeq)*wjٓqj͏=3ޙLFym8yXo&EX#TDY)L~X*dUշPG -݋ӁvceiN[5,Ko7vZesX<,_ 옗8#1K~yruWcsLr_鳣#+S赟|=_*t[ml\V=KSBW g:]4_UՆİM:;vFCk&Jv迡Ī~h ;1ٳ{6jZ3G2#17fGlbBUtĀ篲w:O'scmhZx@9~:qsDWO1KBj-3rbĀ\UfE(`E|56Č$k?\ 'T IL-h}g\U"tpo%VLvX#$ s9H> s965!XsAv>21$mTQ{+%rPU+Sv˔z0hr@XcZ 6uȀw)$>HMLm7ӡBr"fQ>Q_i(iD@o;*^,-qC`_=\oRn(zou)YEm԰hxeZ6.M`ي4Q1:#yE0r OQ?!TIvT tgT|@_ҿ@]|کô}Nxz3;uIw/v魠05RTfR+ZjO2oM9"{1Ȇ;LY b)xg,a{k(ѻir?[,(p /yDu?zN(Caˑ}Ԥחl'` ] &7!B*'2shAA4Eˁ\oS!M$r\ "䞲˛p^ª<^Rj,"|Oi[ O8N颡fq%6xDN7f?y\e|2*Ϟ +5Gv2۩M.?j}+/5v"TH󹹹Ec_gu#PbdzE.Ӄɰ:D8*m=F{>F\XjMtBVl-f|ocffa3v1a9GtN`L?h. Ncbx6ح F/R>C x,|mfo1q? 
5}\qZr6.>'bԥ9F+s]ՆD걙7/_/>Y'0&*UO75*./5*/y2X`AƛgcDx9ڔ!wQ282|F2ha^a0X+L#OGkI#JS9}0AB1v { n*v?tc$CHh`j1mA1At >P\p$չGࠠ`SCrWȪJ?/X!zZ]}= :m8<LU{DTjtb_,LUҬEw^) O ?+t%]eTm:P_o@R SYO%QVp[>^’5n&@~vH׹H8"qD*e1.2ńVrsehq,WVJ.3Ks`,63?qRdJa0VIXN1#a01 "_f쭯|4&(?"5KӢT9InUP.- >E'[5_MHWr4Kh*2'bƙfX25u֋i\5b}$ګ0i``0) 53^n($wU22sn8I~dj [~"AoUa 1x6_Nv<A+W~+;&L )1`֯~NJE\q{18eJ|=_(=< 33pG3%@R$B G1D3lP)KA[N >FC*OnhwUbwZK 9INATy䉸9NaZuIGJk" ݜHe 7 zMd5ed+ꌉa6*5k(W餽6n"^N;4,=gm?hhEYWL (׋i*!@g0k#Ƭ͈}hf!.jDݍTlZޖcFմ9kLۭpg'ZܴOfYP dFƍam5@` *:qV]L%UvP#j0،WhXb5K沢W0tqޕ}!aLYMb5k(ү]M8E=i5iN@=;[5c=+Yk ef&jƽ: KMO}o_`3 !ݡqK Yi&r>CfXcA&n`؄v342&\#&WÒ{&Ck;RFS .hXj|U>rLrf9@Ss.35^q2+4W2 Tcch복^3֒/@o3Թbj3}B1<I2g:)ZC;]o6rwo#qW%-!i<0:U,sM~fPKѫU&֌BaEƮM]Q0jtaޯYM?-;7DaM:.{ͩS#.+՟pvQsjd\i c1F1jڈhn#&i: #)t7X)ۇY&YqZrw0k#&]̺NƂCYHؖEP~I26ip3XkfoOǷfr CU8/S.wG)A@h:=xoH\]"=t|IW,G4MѐŅE"傔dlj]_9180LeƈK3(87hw1+Ca+x@Ώ޵6#%ȧ[7Y |8fq3_n =ؑG(?bٖcet;bg5Z׍w8}vFjAɮ#1mbvJ<53ZfR: QB"q' H11wH}Vw^&2kmMXemɞmy)6_lDSq[~6me)P0ZgY, )rb ',]rp9A79/e7՗׽ /;!+.a^@;鰝xVڃO/%Ɓ"'#_ @XV.Ki M{MN]ZЄq0lp,!}|ys>|bV/=i҉Ezswd7^n|fp8-˔"w|pB.~.(t륏2\rp)Ƙ:nn̄[;,s6X_9I/Kkx&NR|1gh/yӶY*#f;ej'Պj-o^oMvf7aǶ ZAw E׳g-ͥUӒ牱b s•]_DWgHn=Sr9lBR]痟~Ÿg3%: fq~` M2f?7$=A'#ަY5E:ė١>p҉Iвv!g$Q/6'_%{qCRԟȰcu V\qVB{f͍/iNn#<c#Ȝ'QB$cF!Q93 izBVV7vk81ו*Y/Gg:]ʙ_?N.yyѮ KӡzѮ(|^AXV@kQEVϜd\sc13Kٿͻƽ1Hl$#@Ď\*"?}/jwH\X)qIu*m3+Mw,f"Rg3xYuєpg:hr874Hb<6 FK O\†&2PC%EO31ZSq%ƶP:%-J܋Q Ɇg7dBw02ɸC=)"^tPvl|Hʕ@sl C o8JM<b 3?x.^ëjǒ^9'08r2d80ަ(rh\%Iq}WYG2o* `7BB7oPK-faqw5O0ýNzC-Gլqxmf룟jH;[EU2b BO8JФbXu'&!\S!\mk׊HuSkT>rDqmş|,Ege=Tq-=OgXQLmHP|AV&U<+n6/ !9(Шxc/zRcL+1 Z)NU[5,5O(rJZIr~z0"h3::ަf$q E]3zXT΁0&v5Bv!+*tE .Xj(kʹ] }Kvęjk`*k3I5w v{>Pɽ䷲sez/=@ ׬e!Z@Y@R(kCʗ7.mC 8 +$~Xvb(6&%]֒Cz}a~́E0-;p%R x.r%z)4L(sFw~c3rߒA+cbaZJ@#Bt7&<:b0p,`LdUx4?,Z6/~y͒J++5OiOlYA;?2^_叇S(tUi0)gK #1, r<uŸ&`f/{m^Q=|F hWY(IDŭ,ʚ7圲4Fq!`#Jùpvc EZ潍 oaV۹&Q4IbRr%hg@@Kkټnh)~8}_b(hB45R BQeBS'v;ޱrqeL6FU;u[tOI1IOPU<#L\f5py.\,9!DKjA6͒ǞUB:.T^Z;mjj6uBw Web Y01*(*xD[O_b N 
3W4bvH7=YeN;2Å)kl^[٘`d1ڥk1*@U{b܈!XF(U1d(׈E*GVF6hFsn@qKvIbCpvA^wkr1NڞZǚj)#F_>Je񎞇{9SG66ý=9ΰek8>?lBD/V]MD{ jTv?VwJB\wE9KV4-I}Dvh-z϶l-y&ƛPc8P:(Ar7.:sNj@Yj;qV$G纥8|whGo5n5ij3xRvhJW'p8h[,`ٜ5gY?Ġ5gTe4 V3$eWh0TU DU3<_\붓yc4|SqQOĵ}ێ]w!%d,s'=ߕiq;k5w>3$ΏsF҇aEyM8P1X?jb娡;V+wł1e& lf l> ]2}v(Q`٥w@^j)\`uL{8M> G{R^uZ}`hDk[L|.BxAW.JtWo"NiF⣭am' 7DzuYh`ڍ}j$͇Rt2[{O &D ̥i ajlT֦mMq dx$=) c4˂1,fN|Yߘ|ѧ)\Wx>+s"J_|6cih3tV +1qYhp;gc 9O}}G ~2١8ypzf!^P5"VC,CR4ucj!bھ/MֆdKF4cWE/^LRo~2|PZD]+\ ww^Q^;=[AY(fYDtS!<1QJ"ޏ`F5Oq*kב0T908`zNg3-DkI.BC*MDG;{9`PRopS+bD$2G^ EƎ JDͬ MḶuD9^֊N^|XZ{&9cŪ'b(IA1FƊ &;te0L/cr0Vtb#/ F2#ϠQ%F MdkvVP ,Zv5= FZxG&+w&jÝ5X> ]FB l.gTImw1O!3jHfZ!eJ*!Ўw,AfFBF4wj1 3m18A:) HSI: ]vp!m2 rRJǒG:iUbm8HM5 m^䱎pT#Kr`{@21/1adz_&o.R93)54ϫZ$bqk5m53$R+y%^sJԺc6{R2e͡Oe?&W,-M) \Ed!U}Ayͨ.foNY%HgzG_ͮxz/᝵yYY"2;DIT&8` i*Ȉs O `Tܠ|)'wL:-dImO3> t#[2M^ ; Qm" D[/6Խ=H1mjbV|X"Og'b |_ D '=P6{C|CɊe^4Yډsԓ@#.Otb> Jw(}$vv/VS`bd]Iȵyɂ|Cg^{0^"uNC݊P5P=D88șSC<|J})1dӄ.*y߄ԅD I)cXSK |b#8B(J>'=Y7;!ds~g#m{ !gM!1%T1Wab! gabB*F9n.Vl9a9U">YkX%N;ž}(\U]u=T;DF/䊁e_QTCy{5qqTwYn(wſo(WdWHf6w߿@=8{A_&$biӍ V@60 9 xxhELYb4V4֠ @DsljrǃߦKpT۸`.JIrFXl,![|_#_(.PbYz"G#"bnؑ麺g82WxVRpgFya7F aM!Lψ6%sJC^-#+4\K/E"8?OdD"(cIj2%-㋜͖,mَ-捫U(v ns\d~b6N&CbT`؞&bkI}+!1aEY i8b( ˱m8vLA<  O3ȿ =9xv;朑X#>E'Al;hh7"Mm܃i3N\-\nS `̺>Jkc~4ͼvIvޚSaբ4i:F"JuF.Vk#d G1Ҥ%Ii~8n}pJFȊ*#,qQZhmNmugd&몜, F1ȶ*vޒyEIcϿc&Y%s|4\F~\(r#d\e8e0 @d9ijaN5X "ˬLNW2/Y <; ճ1f^n/ɏ; G)iyGsZWlXiOF =xsbYf917e]l ؕh. 
`]]~ȁ\7_>u%d'c3T0SEN2ݺsFr^CA;83xqv{9̱EmuBc:l#S0KY|H:y˷ 1,%s0\3*v ׆*nTu;p*#*ޛ~SHu$QW2BǨF*%H58_8d^PʖCM#FȰ94ro- “{Zkh~,V,,t;ݭN`(Nг:zW}^~2<gBi=7uLs5 HJL}};R-F8W񔎧ف(桿x]oHW}t#o4^FytFS?ZJ &sjdTB/wJ5omBv(pmLk1p[CQ5]ZuP2ןbF#r;$RB׼4oNx O] () 1  #^1&^3OQ,o~4JO ɇ⓯[AnRYҙu9CS I o.WOeI A͹nݽ8$byb N6~J2KaLš4/e+5xANԊ \q6s3c2J˧=&V Wb atVٌR-X5֔I(9 [ِ(,\;,귧/I+#dQc C5{ .wr U={TVz+s]%ĥ%^y٩/@-R (AՏ- ryavU{m>H u \7lq(N qGALc 5h^J6PZ-VCp{wl }CqqXND8sPb>fU|A(LafMH~j B}NZl 3jr4MIsf^\I҈Z[cmrΤ [mFZ u %^iC+Ip}0i]-KMƒ eZ2@c*˦: 5-/A^()4uܣH % iz H B%= Lg<ʰdI5T+rX{;ws)1A(hq ԏ“,nR-1ߺ`BY&VZgΆJ$KR2 #24R}AMKyP&Se}Vj 0>:#V+FNH|{nNJ#d$e<^nGP>3S(H!0Ape3"'Ur>bE1"ⲉs.ICo<񁇘4ΓJ˶sHv1XI͊vd(]GiaMGW܌l9n!i /{s*f,3xR0HnPIؔdB 1 AD p 3̱\g0enE1TW0Ϫb{&M[@ҦmGV!^nkdl)4[onDƞߢV~޸UYߜ%P]il_hLd,iDWٛ3N)㆓Q5gNʬRFzؾd)T.W0b$FZe(!! u]9탋ES =[Ң!#Q|BBDrB$Md^g)HةIMQd6ثKDb1`5dx̉.5|oWH(e\FeG!OXijFs;]U"Bp^1*vF1ԝGwAy]2Y=;ߏo <ܗDIz0~F:%HeW' ֐'P@RJ:2zQo2S y{9_R䌨T2G &xGģ2$K6*:- RnhC]'Cۋϔt߻v՘c\GU55;ukCjn~&@bVåbu!oZR פkORB%}2?+i AZh%eh<>TF9 e\c4GZG|7G]= [ [Y;C0VS:[@M_ěuMش4?~Dih10wjGmW/QCׂbH{m$UaN#Γ;/{`">A5P"$ݪG _@5SH]0D@6IڵZIx:yaLutXh}Vы|=m(Ͻ9e%ݾBoc\+jRoB 8TiL'Cj ,\Yˎ?A)e,4=X-ZUdsȁ*kBI6qΑݕd v?,O9x jz`L-rf{_;o06jXM@8z }m0#Ӭdώվnh_[nOk?%d>q@[mLNmv{C/-`nTWNp5T$AN7;XI@ Ra )҅`wFD0(p=6V(&xqgьydpTόCϖ^kRQF%f ,}7ZD ,ZT^םT\"ܮHtg~b #\aWQ!RF<*ٲ:/bSQ(TlK*,\(jp_.U%%L15%4m'#ٕZD#H8{0Kcz%kz}^§\=m 5m&?F̒dIiJb!Z)}>IJ(N*%3-TwdS\~.<%#Sfۓ IQG $G[|ճ{%:Vw!oo܌;$)Bf6f;Mⵦ78Ռ07⳺L8v-S{Ƥ(^1ovĎ$<o_Ȭ{Mh a3vX1C6أŽx=>M弟彛.xAؼ[,QS{y  hս6d.ܺŚ x5xY@EA+uiFOT_lQE C TtWkaٚw :Ь‾b?>jNo!*{jw/;[ڦ3wFdgqGۜ%dg\͌M_7,U@7MTd܃}uUIn_8GCN&x*=ǯѾ:[l4W 2>ܞԍ0*W|ѭSz]lEH("LC\=9װ]8e32_+" zRꜗsEkhlRǃEy)*֓ƐЫk=cR=cb X>L\+8W؝ea$!gP( `mm^ ӬbWo8Bjo̝6Pw#U:: TK gyw}Ki.QK("9oM']OHĻvauFN7FBОcU/=Y>ÜveӇe\.T]CJ2sH%A"b4@BW1Sh5̵ؠlz@R&<ǒ0V%d\%BKGGcJ1q0Kμ W0(2DwCxkt)U=ÃH_ `]{ t+#vRw}U ' 2.I b`@SnC5^>ؚ_e)TέT^`7.RB:4I 1q)TvHz(Ɠr߯?GK=sf\;ild2,TvebX,{ga SgNW]Ix%A:M!+hfa6&S\Yz3dFwrJLs+t%V|i3\ P*0H@tθA Dc g  ~_D| jL?[Z bΠU/5xiifWU ${臽nq>0D,*͠Bp3PC#ؗsoy-yX>pSԩ5$Z`飚m,O*7X*@SwLRw-xMΒlL%G?;QK"GT{[0_ k Gwx#.Nxu>{ɞBK'A cͩe 
XdWe72eL^W"E`ܻ/AќsF<8 &7(NQ _-gGRq ^TO7E%mRUW6[XH?7bF$*J]46XmT#32ÕS<z E @ [dL3)9X"LזCͶDC题f6'#3*ɪk*ֿ~F/?V3~ f{ mbBtM+^C6VR cu z3XjPrA:?mEoaͿeSEgIz=z:/_}˾Q IP1/^a3j7 뻌c}X/KM+yەN# Xisclhf1TCM?~7 חUQ>.;,VxbԖЗѯ.xҺQVR&%M5ѧqq=ďfi NغI( UhK{xwJ ;<{.fd{EܽkQN}-{ȿʐ +hEfQ,]-I: Ϲ<^/] "eY0UMV]pTL& :}$DݥD7 )XBT.: ;N]M_!D^a-ˬ=t^I Eꭢ"5ɸqfU5fkpbayxǜcEsN[ؒOS__aΧt q}5h䘩94“kvKхBZu7I5pАE2Sk˽=MU$'w Vʛ+K9H#+cK[hPnZ+uwc%)^[??b5ΞA=7/jIPS@鸯B+)[(v}:UCSF L-mb]zLj$o+0%Xa:`&fIB3ر}hyVK4.%?%aɡ'SȐZJ4:Ǥ>{fH!'liAtZ m<˂?د"W\R*£<Y:0˹?<$0/5[LE,^MFb o;f?bcDf?@< *jaF1q5 (꼴,Xu52˛t7veW :bj߼0O~eʟF3ޕ$Bie}Cm46xZ,yD&e%{0}#IJ*,+KuH*pT9_fd1yas6'z i' K ᷟϗuDXƯveVH 9WMfp*JA6 a>Yn}S8-ήޑnd˝Xז9e_ P&yE8-~ǧ~xB3~(ߵAg≯Ԗ S b!hY$Pf4+xN$G?mso>8س c hzAg Q * &$@E? e"70NӀ".?[f]RI}wI!"\j`&ˉrj0^)Ď"dA4i9Z b%W1,JX0Z܈ >=[ԹpJw;luR0\ n2PwIf9&a>˝:Rs)rc) 5qЈzA v6G:B:%F 0^ m1XK{Xh\rG\e9o-#[[%=< 3򈭠hdPY@Eb=%&7xEho^Qq"s/a@euiN297%Rq 6L OyN1F hϘy<(JP s$,|5E  .5AטΣYMa׺oqGogz#7[n[F5z&䧰b*ޟb`c"*&=5]Mvy<\6z35btd># [+nL홌ׇsَȃU:KvHFdE/xm'\6^hC\NZkNb%vx+LIUFMAA4$!=7X+$ D22,|8CXZIc`m[kw6R7]ԔvE-p,b5Q1@$\Ja%9m@Q{.-q (wl8|wev53p"7& P7_@Ge`7>EBB(kB&0kii Cm),[H={7ۣ`7zK}-ۖ[`@qU)v6gt%)hF8JӓCp8vtfR>[kήϧDv Te6kޝbc@"{}\1D֟| KUdU?Gl}ٲpTwz.WF],eXu Aiz,Y5auUԽ`u%ISK.GMR첸Y=}A`9M '= s`#8Èy>9Y:g/A:V8axjwFɨڑd c-Bp>]]Yr]zʵ W- &d>Nq v%=ޣ7HX7>Lw8T@\BB1r*OG1=t',Ah8kwF<5 ;3U-jMZܖ6Æwu\ω rF.T5IPճ|ݞ{4獢WO~6>9g0]ԔƒcX/]3,ĜN}Obc"f%GPR_T8Iݣ}ZL]d.>"]3sc3cc`VpBHܖ4pLRޓ<9)3+hٔ Iw\iѣ8b={L"w|Ǒq\aћ* ᥤ{x"v2实W.ǚ"0<_ioy2isեwbqYk\#zO,8AXuȌ2-Uz9 L,oM 5:d!%+ $rOG o%a.] \Q6Ĵ'EU$*h쓝EijI1уZ1B lns\X±`HƷ#me0`dzҺKpbz5+[)# KVl vdQ9bb94FpF(x=Q;eD4杦g_-fS7}:rQ%B{Ԛ+_jxoy?fo=BF0UyIP]f 8tڒf:s<4 WLjnj{HŬ*~.HH!{Y)e:4UC潫#k[į{Ĥ,Xa<^xsT81TIYلǕyWlwDQћQf6 .?6P_՘QAc` @* %5q$M~ejriZ C8d)߱%!\ k_+h ɐ}}x[*峺K $I5Oq*ly-)h ɿ7pȕ 1~N&MZNC1oe{vt6! ;ʣKE?L`N`b;c5޼x+#S~Rn߷ceLUޭ#woܷn'w~?1$!6WZn?ӏMX~MAB&s`Wͱ3' &3x"l.,܇~.Z4k*x',ԣNRv;ݟaKRSNvm7 o[uewH0^JlWS(Ҥ[?'uhPm[JJ}̂zAc5&X1׃*űW~+>0Զ? 
YXԙu>qcbc]+SHOfaiOP|\Em;iҒVJ1rm#ye)@1T*qE}}bνaYxfM9/tŧEu_'ki[wha-|Me7RkT'mf -byӺ.1h/a$aaEl%pbbbÜ#DrN̿<0ۓَm6{؂*B&F:H*0<>q؇M Hn@cMMfsK`sxu]=jtof]1;{b\eeH5z;W: %SlQl6^Q*BIs2f0`׀ྚƟ'ȾIӊ2&>oޙtl{C\v5X˰ƒ@W2 uUAʃ׮M-^y[Yj(qQj/.c'4tNSh7F.,e PU8 $>yF. aԙʵu gW7RXG~ܵM򃀣uI^ݖr*ї]IITVҘ[)a6s1ZsIL\~fqq;cR~?+][n=~\ftzv{̓]ZL]xfvB PgqgH2M*h Dp_Ȓ#uq }A|c$\D3i:<[m]&+zޮ/jԣ}qdAsB, ֨crQ|U5 U8Bpyq޸zvDϽזK2qKoЀ[+~-I߯Z'OCHqL:.i@Ӭx6}Vrw3Œ5@!]u05t:|Z0-:mwzK;";wm^LA gJ"% ج=F 6spiɌ|,76JSz%cX3(VnmGwZf~&k)C+h e w]ݾl^|ڃcVҮHə S!tRJUY.ZrYwT);*ʬV @ن ҬBԵ?ӏ봧gV}J p1i8,]޵u#"ew[E~J2$HYZ%[Z'ukZinي,XsaXXoU:l_"(ܰq2ox!.pՃ|mx˕6ڰW1eer,5Esf<~k 1AOu=iޮqP?=c<k-W`5j];Hmn[fܵr0֐o0U ߞegS*)SY,3GiczrkgLr|8op Bo@N?R{C^nzaG,+oGî3fМɽ?Z#:NU-+!U7؜˔D!Vn1n'n0b Iꐑ+G{h(ԣ0`G EK ;QkLAI=F삕D֓\o_FMh!jgoXkN96Gшx|!5wA8F>0ь[wx'%3Ƴci,Kuv9W_!8Hi9NɝJMΕΎsUStT k2ھ[x?%_#gn0ڕ&eUBw܊wMbWתQTe'řUr}6ηԞޅVNfcuv=yk6sn pvsXF>;Z.!Ih?IR[v I'|d]k:ZKcMdp_=x*م1vQ1vNj{z@ͥ3Byhv%&yXihe%e’MIShC+rvL Yk8lY>afXF۞ AyuYt,Og1녈񾈋q >?b29zHm56^OӍ"I7" J-둩bCVmL6j[^Ɓ]h{ƙܾ.mk-FX8Fnq,dOLss0BZ礴⠤TS0R]EuY^9ڶF2ol~=z1?Y /)8V1B<?MO>aa\Я X΄d(J^.g^W`XԀYnݙ] Y6mFSGwpRיv|/1baz<Ϧ7qS-^IPpwdj 7lV7z9؎Woffr)u_ΆO0T7GS zk&SkνpaJaƤ LtÕ1+/Žd}#0n%U$k+heU7mݶRSRvT+.y,k3 ޟS>_;頫ޗ} b8-ӭ,,cnE-!g+YlfLUNR xfp؁Feъ$1@sw~6?;%H S"e%/޹ARؗN_8$dx?{W-W^WWwcp=?<ˆ'u{z0녮fW{)|͟_ų_? %%9x],tu//s7ٙ"ib4ה\Nh#wG03?y ?:fRlqגY~ޥ9U; ]Iľw{1›{ЗVP2M K}Zs2A.qwWq8r3UcmmfՃ8+ٲ d!%2̈:R]lorh. 
n#f9:tBn_A8uO>S{㑳劚"i@uϝJʿckfs0M{yvC꧵'/4DXf ٻ7tWGV<䨾KrU[ؽx%af;{{8zv~%\ݧ[/٠}v[^VAkDbca Mh(d_[2Ю \2y&q'D_& ][fz+@ȅFJFes0:|VXf`);.TձI_vf?Wr ѪjF`fq*R)5nE 3IUlUU@}D@2t +ʴT0κ:HiX.L.@)Qf-jj!h(FVG:"34ҭjpQ3{[cMb<rMO?/ k,zakDF'"DCD5a9qH/WZURl.,eխ,inbf:j|GXFuh0c'T ܔ85>(S41b9~YV]ꢗUX,>D-D>J0m{sujONQPb Mp ƝTj/Ua 6C*at -A $ksiqn\Uߨ1|jDw)1f5pw E* n0` >v\\ 1v}m%t xM`Var&dIRO>6 2U* ց{TS5!YOǟ`4 6MHZg׻o"Al~*}ejR7|PU$h&*eaؖ0(T='^MvX>Œ}6ɟ/BAۓOJVIxW:(!3,~`6HM$Ҡ]ЩF2ISrcW%ٚ^˻P֡52(ZSϫc7-EJ<0*5q m+fmG^oܣN *\Sql`h*tqIImr`II$qZO;ZV) H8&֣@k)q"B~̤$- 3@h7yRLmp{o[N\JBXѦgz8&n ̿p#~d0Y/oz*wP-kl>A|W 1緳8_ S676S6T +*~yc^zfV7= ^Ṛ G0@^Ic}n)PnYf;7}ڗ/wJ1 FӄOO@9ܗ#3u]սSҞj6ޒ6r*H!OIxX3%Ӯܕ&"K"PRl8F B*H-ˬ^o eO83ٍÛ~d _ώm4yff kgff79Qˬ s@X1gj7SU|CcA赟(?-ĦZy/'̠ESw~4OG#olz*p4&3?}8EΉD*&cOySWKݔ=˲\Dr.⤗!7(6[ _R4`hy -B7Q覂+9R|鐭ҿٞ]uK. 2PLQl F1 % \162fP}  LoG>|q~[ *|Qt˜MCЖ1,Y㗔 9tE9C}ikcI`z`3q,JrY 5[3zM6HkI#s{Xh>(#fuĬs J߼C&% ~"Ă]J{rʺf"(mJ>`El%Rl:P0s |hEQ@ųM-d.4N!lz0n4V^aNZ9͠ [fsȰ36n9x{F,6ӻ$C7gSpĠpz6N&zwۖa9_*V9U4DEEYU~lwkBՔXtG1ԉilp#Yٻ,vTy >_juɗ_&\A;0ȔrPѧ~LW!rj,LJ *TKfᲕ?Әb.xo&ғ\ƛI5Bmr#CDu֚^X~UWQ%gnO Sm=.UqE>]^i?n~7 yB8VZu=y?rK/~BbŒu0ǜ~vv«\(]d~=Z?L;6Z7p{ \'nts z/4{\/Wl[/Vfjx}ӧ:Cɓ OpǦ?o`K( lbKc [4eG ci$D7;&w8ED !į dݐy'PyrKP;rQoϡ] WHͫ+UGw$D-*XVl7m%SС5MBYLs`8TN3֌Pɓ ?CXv\:h*g+-"g[j%75TMͦD:&*1:&69'T߉,N%u AH[]|~ec}gӬ<μT5C2us 8V o@b@HVDa[w=yl"Y0;mmJ vѝ1fxklx=tȽWQrɆ)*攳X-d +?vqg@mnכQ6dV%$ xJF &w=O-c8jJC^uO{?ٲT]8V$x!w'd m-\ZVFDYQE7)$qu7tdޑ~{d=\z=~u 2+V ?Yzvs{] UZ Bu7U@&ّJ>*U39rM1WI3AD0 ?ɤĕǥ?Wwc܇ 3j;[39"E402؀,9Fh960B{ar) nQAZ/:1R"ᰫk Dy/maq^4D~f/BԎW 1MVhm 8|@Sy`Iwގy6SXsl(睍f+wMНͣzry6<ƅ ygiw~v72)4͎HKrVSރ~ ?k#08uϿz6/ozs8;2 "z"~ȭEѢ8G?UT/OZ^'~i!F]nפEɫ<'4w\+S„T(|p[t7F<J@hpڏQDRTl.mşB<_ vDN6kZo"'@}K-9AX6?^[:S-Jk&~`xЍW֔-t/1H] )BDže7`V:Ll)Cd^3ݓ:'_(_1Ħ߿CJ\y`tÛ&SoP{_*\U Ԫ(CJK6mIqΖTs[EKΉ{x+ԅ+ZB7LX 1X!Xc(p͘#u3wT8_Iԏ@ o'nMy(籡^CG 0x5TW ]qC0GEjG*!@^V 7\`wۺ<|kLZ!֩B>Rps< G[6Gzo`e\Òds%^I_Jƾ:M&SS?+L5$rıe~bFbfwb"]Ϸۼ+x剅zw7ZBB毅kkJ.\>VhhSG=8{+r9{f3sٞA@ jr !Njh EPMw۠cZ^sPH">e{$E|?_?_M!/?~G[+"HO^vgu/Of|Ǎ.ه&~ <F۾4n5_5Y%9>|8߯aSbD5qR"Y$ eb"֮R*lr>4'C-҃BZe? 
[binary data omitted: gzip-compressed log file `var/home/core/zuul-output/logs/kubelet.log.gz` from a tar archive; contents are not recoverable as text]
wsCn>qiDgņkMs7U':aWo__C5uQzOW5׹)e\>W catrNzz7W.kJ3>}^gGn;#q~+̟s!8& gnJ|ntb~( S?Qpࢣ59G'yƠ$@{I\wPG<(x0={;\fzCH1p`NثfaǰW@}$eۘ}$oj/0WXѠ\wP IA c@7T}ίcd?t_^v_{m9zSCt֬۾Q ڕ v9x[}"z'_J<G+>Luٻ޶q-U;7J~fLI:Ŵ0(J4[n ;v#GN "ytL^ov~[H(}{яW펹#iKn/dG5:ܪ^?w_9&/Xryy~]AFUv? /'WϗGg*WgGV<3c4$n5i'kr}/k f,oK7|5.hЉ\)߼Q=5JRM'x/OYȡl8k7vm۞uiw{ C̍#1Qx;L`{ _ߞuOBlRrL~zmuNGqc܍PRﶗ”6ϣ 4^Sax}Y~ǿjܛl  ؾـal佯9ߧח[}?_0ypujl;WEUtCa C;?/yBY2.ET39S)e(ĈJd)U-Ob/?I>Z j<=ڧ'jm}l:g`=E= vޟYWV|mԛ Nnv}i?÷O[w7XEd"}Ɨ%gL@p9 <9' xSCI:cs{H DxO`ᒺ7AN\]ucES#h4( TT!EB"vÈS򙄍Ǟmx8Ǭ˶mx8CB7Gn|熎uh [R F$)8 ~2@|ǁ2))w^;&0nhua~p') E&dXfLDN;6ܐJd %Y'Z;Fe+#'<@SoմTč7y;n7k#'g5:E^7_ E`,yfrnڋS6 jgUo(S2* %"4q+ۑߎ,"Hr)WMV~䂚H)6,:?_nL[~,& <%ؘ?ylr K X<{HX8Jp"P B$qngІ :M14 "*~qm \'8\ܧ-&nqh_`kX'Co#c̛C.Ղb$Zؾy켲9uwjVLvj`ZM3N0~GN\P0j'U'cbW14:D{3d=g*nȻYJ7!4mo/6h{-7CZY Qxqz3Mi+k3u=D A{gEWe㿒?Lf1|Wݭ [.Z/s5f^]Yќei]^leJ5uCPtůRW, 6/.]Zr-Jw>z_hǂzkޞޚ'\?PJ%pt9C JFuY)+(ZDpB)`ّJOٰJW?W7/k/Tށ HնRpJ=wUyf)$sG*6O;AjMlQf*_(H9#`@G?ZOH =TBmAGxG:HwQ  o]kõљ)[\ёB:FBCs(Opn4-BTI.fJތ"$ rUj!5L1Q\DKU)SCJ%B8` kJ 2BJ#Zlַ%#FJqa=ۦnI*dֆ~m;U9HRLe=!vOħxP)hi}VDY܎ *Mɤjx9C0cs*Sɽ]T  IVW%w1W`0uExU ˶0O+G˺ 9vqOk&1aߔMA>Pޖ>X o6?oPSQ>~U5P:Z-X٨f4 h*1e,XUQE%UhLPrdr>d$E-b۩Mqm9þ aѡyChjҘ(w%|$˕Qh^[?&$4nK@ٖ|4h`5%hm7Z޴q Ѹel/_8(2z6%K 9d=MZPG.$t%GPsThq/̖?d2AqUQ$J%ۊaRL!}h` .YF+å.D5.RdYffhe! 
*pŽʲ%Qf {Vnmdn@cf$K/f0Iq9$1N5AKC,'BŴeʂs4%\!KJxxP%ڡ:))zV*j9YNӆ#OSevy\hh:\Hy˛pAkM$i2 Qt>qA ,S$Du[1W(s˶>~IFiƆuA¨&.!&c!F)*YoACXKZ5Zn !Jq,"1A48Ih'{ >02͆陈9Ck4T=zP)㢆z(%[K0buu\Eb X63ZF~{;w|T"WϮ1I"1rEc>~'(p A( je-Źe=UfN} m;ީUt{sI^ ^Ҫ 7/y>rYw2?7HO"(Č k"gC쉷gRtM)<~ڪ?~7ː?UQջUâȬ_gDU藲m=0RZ~l.asn_obށ\*diwu SzZ-\4<߆e?bUƥM$M/b+K) X,hLQLxIr-vRi'kRZV>Hi( *2O},ӐytPc\ *I-X yoށrrޗgہ:-9$.A7?7}|g\Q;\Pp -RWW8?5$R>DSKNp@<؞xuPԺ 4]uKfcILWƤNUGH] %?<6W.} Ts?PP~vtFEC+ؙ(!ujRJOt4Lbk $y$}s{8;b֡ȭL& 1=rAA1ftW)~$M7^{_ *9b5\-߲u$~twkE> P?);: NvuNxo "I,@Kʝ%IbM+dcfH.җD8Jx%_L{^DV78ᨁ->c)q ?,s^->͒Gocm6l7/up/PŊCiYR)gjN?W`OpNڒ;N6$u]'1-v%_oD'=J:tmgDʀuQ)񶘗i-LMwuk٦ЂLRVRrhd̐z&Ǔ'L`60~/(8 OeB[ Mr,SbɭՖ2auo{k1gQA o~U# s#P >B <0_ZD^4;r~IP;se4Si$wa,LщFk}-o-ڏ']ZRWHf+E?BJz[PVJR^(J EHZ{or&R?(3tFya/$hv1~<5m$dimあ a䒤U<*UcM-PLV%aE[QN폺(y I'!Q\4F' Љ!xHf{(Cw=tCw=tuZʅ0zg{tiK)J\Y٪ bU+W4=DB:N?.jށZyY1s^ۆ0^T# @BrԪ$^6$3Je+T?}EN2i}(;*ȱGڴ"REdd "|wWyʸ2,+姻E Du7!|\^;`X( uu %ŮI^-k:{:{b@ t;6 G~ 4? !kai&.ݥA}WKga6OɀeIq%}x8{oE/YFـGMO!<`0mP7{DC yz`|+cD<mi30J\) C(bJn?=.]֟o ~iX.>'$/}%8B{ʥ. 
@D!-TBFf\Ew}.nw/wW[R6 @LI-*BUAzɝ廛[_= ^ ]|-"ްO.$ -I?LӝigyQRKj ׳~\P  )Ff_AWJ*XFg",Kllg뫍SU˥~o>aѽQ$C"Z}װqQ6>5IWNՒ%잫/%@Fۑ3:7OlJ&?\OWJVBR6"ilJёK9xԵ XSRSUr?kjKvlY[e "CkK+n;s?Nhqjnmvu+Wh[C QtOaIA-`ǶMg[.]J]7_O~6I~|NL3 * bɈܼwvw zO s[]#MmO UJme- 8R̖+IֲVZ((ǫLDG݊bZ*%ev8٥R:'t23.oE?wF٢BYinFҺVLk*^F"65oNGa][ YO*u[?o9)yg5~ĈB'~O \kZ\Cg fk$3h%/Vz=h;ie,!T^_Ԫw)oK=Q0ɨ/7EJJ\r#Pm FJe͍E^d?\0>cma"-T-3ҶB) PO7`sr\\O6CXf[ f*U"p1Z?J~:.j0c5=v#e%=|i=r<ԖzLAƶ<Д'Z> A-8mF{YoA'E}AnhXI5Vّ; +4#`#֡wN6Whg,,DyW%@4WiQu.PW^( 8?zп_;t<"sp]?O/~^\]t~r$ ;j]V-tw?7wfW׋{or!SP(j WҪNJh Ta 3KȮnlq(Kd RqX8$t8kO%J:"]{X~w{pIiqSu~+}怱.:q;ҮzXwn"o_ne}6ۧ{;H6*=Jd)is\m%oVmeWh9ဥ:,?//AZg=452u͉-CDk<ˈ/y|x{ZuZn"չ2_+W=Hv1_$&kJA`,᭗ޭQ Bf.[[W S\NḊa4@ K[jmMIAEۿ3bi# ZwV:;?;4&e7[{+Rĺߛ}}v yB<^Iܷ^ޏ{ab?ݮqe䄓J^PRi]XWxr$]2{/?rZ;Ƙ) ~idoxhHjZPkW}6;f3v5MRFZ_rgB^;OV<~iP(k:ai22;в[UvU/l 剀>rIbw}{/;z5LF_{d$ _UWɈ wЖ{Ju=l;()w[P}"3 48k+JY!1^L8|c3 Lupie.Aӣ:hB-4 Qg|i 0v:nmçov({{:{}iTku㶸۹uRwi_R 8 B)e;&w{TTTHUHa $JV9{崹Qt;RZܫ/AN_3}qP̤}1d+jTU beJ^V2X52#6֟2c&| -!ĕij"UKv~Hc;ɣb2fiÈf/i2FjI]7LNPtOa!qYJ:.=c.=Q>4((t;( ^5E#&P  ^MMTBwRr5{+(g^cL@' }I 5)^G`a2Ѭ\#~7,/€^>||~]O%]-ڏdc^U8 حD=QLU?]Ar&:9 X'f$j[)EV( ckJ)۲VhRJ Ѓu:7Sᤏ 3QSaYzl1cZGj:‘}sHDR,U)`ȾE7wWmH#+-7{ռaY[{>O%b$i<0p)z}?Hp%u:ɹ > {]@UGb GY"Wr(h`y­QQ)i<OV;id( Њ)Iq9eFcp j2 eHw8r Bu!ÔdL@Nݶ}ξh? x.hKh{gn;A1bnE. Xiȗ> .! .n1a1Z\a@ )q-#clRS䋽NuBՔ!f:'4`JdG[sTn;H14+K¹1L[ư9-jډ:x W܁%1Q\@<<9kDج9 vA5ydN4đMWs6@->~SJp1#ǟW\{hrIf<)%;Gƨ|[\8@:)a7ͥ:"8$[[>RYp.M iDT2SF Po, _wfێ T{ވsdSOgjjB<=8HeȤ1FLg,(5QNx>vq=A;TNUjz\KpvpF.e6z 85,"򜋺 wCޔu). 
@U&X%T% wJ(v֯07 l,y;9m|M %- F3iڔ"m*Y4~qtf[F.wx \I6 nTӞrg~^yDsLD$KLzi20pO<\y[fǜpC"ƅm@ PmzzyY8GY8 gΈ^*Ag`nxmYV+B𩚝_X!V@gckWzlrף+0bϲ}QAESbJ߉6Z)G6׺CكǾ ~H`bBD|C%ШXr}xTPzP3cMz*GSgW>D~iM]TPT2e沸!+U8qrW~ HG3''|8BOy<;5S|M)mzҘ@ê)IdÀ)4'ו{Xf9)c>,,~N1:723pD KJbDRUmc EvOl%/Q-@Vy ?{۶쿊`\f=M#I\4OKHwlS&% 8cɝp3w4SR,$QzusQ7W 땄VBk$;U]ku7Wk[HuZ7H.waS&R8k5]gQ=B-*͌<&Fmc,C6UӰ 1NajqCԊ76qζwl&jP4?\h:a/F$uC8ܾEکeor{xm\m慴$}ާMIh})o~9xC'wzgeL)}~۬۹/Df{bDж.%" )aBRkn:\|T _4(531>]v BHLĹwMow r`lSm.f )im~) N=mْިtF&8zGuĦ}..r!6ڕ[![i8y]eLooKQE|׮;q=uil#{ȋ!3uDRdu}tZUsDl3;׷Q&0VmV\mb s7o\?֝4̊r⨼ƌD_iHp4ٖO+B(im>C͙no"kJjs_)zh8~==eE' Жm\|l IF^,7!Ja-88_ Yhw:a:\nǍ}w PON.`*6/ĒNO4~g  x/C(ϧQh Ӯ;NLvhh\ }:(&~MrH}Eb%` t~xv~xt{7gn>?>κfG#3@{a ?YI0w{ROc_֖Qf_:xG5C?)x <2\Z8 Mn~ 5 LJdzl2,^v:k;~< `= 5/d.}}1:fzɥo?o(>YG՗y_>Wp]}GR[/\}θt/ҟ* FϽ xTr8}>5 ]; wsr/(, T|a߄uIX~6==xG`ށP6PluDͳj&Nϯ.|?? p*=;{58/9K@gW?QYQ;(s j1hxڝOn~?J7(J?qpKExbsQ0(M; \g83Cp<+ƒ+ΞɄP>;{H~}իЖ0_pūIfVxt; ]?T\mLNo nA[~Fw8ۙlz Y&,4Y]o(YLjQ.og3lїeBȬe'Ű/'1H y3 ><sQiPn P}%( z9-,}}h(Jutf#2_fDvr]\v?N>|h.^3Q-Qߩ"UL_a /%`7i pr,u8WpO`ll;#˾M* x<vp} ޔ.QG: W?T)?_'I["ƑT ˄K@3g*Ōy2ϔPeEkD$I^*] &L! 8v`}!u'xB^5Vp'.;||8M¹'Ϥ[)98-oQÿmQ~[|<_+Q3˸b:c&L9 AJ(`>GSX)-bdy`y),+&L i CfN18f;)>4Xm(Di^EӽBUE=+Düyuނ^Ǔϟ{)˜6#oĜ5R 樺ԡE6%9k)ӆ fc_m` c~5];g/O Q'3rO! S8lZ" 7IP}/P!z`mbO18uC\t0΢Ä 6^B"*3VXe ̛N?F%x0>][Ą:b`[VC c~ /פ=TinvWpA}>(D<{%)0׮讫Yf?ЮUPUQpZ;X.YkTb[|}|׵+quu-ƭd֛3s u\|^%W+^劒J<10t# я%>y50-*|; BR VK0\pZt^F_s0<>74tgS\eLnL0{F`>DHdԏ)abiϵG7+AuK87W>|_RJm+7G#s@ D^b /1DG Q^8<]!&0uXʕDed)$%,dt*nF )͡Ĩ̎|:p+Rq3njrVF7-XTҲRHHB@X@X8#-|_ p`eխ[Q-խ[~芪 ~ȅ>Mc 2%2q7/~}ǃ@psՍA@T0A*5 ڜQvoiƕSryBe L-^bT)FJCC̡_%aݱ FRXlyXf\Ln3Sca"'lqVT֭zM^_ i1φO:餟Ѣ`w\C0{'H~8{aJsA{w4^Qp%4|U@㵔jm/H {iva;i%RB"OQicl~jxim'qwUuSoSE^l cG 79%1"V@B BIfdi%K#(YA&GP Vw̡bm~ }i8  A$cZAf,Z-!d#S&DmD10:!FtW(P-8jp3LȨE4-N`F'rp4cYY 2F)r]SN#:De՗MR}E(\P#>&D"`b&m ia$6!jF حbo-݌aqā2N,Hf]k()L]%"7 &1joƃeFp!NnARcq""r d;I~DLsր)G bE%&1A@@-c`T 0ʉEj۹R(Qa;;dzYVQi7cҹ'Տ1d7禟07c?   
˺yhpqBR>BL)vLȴCyNB߇[ m>'O=)^tDՔ<2T3SFG@QE-iY(&c AV~)io"ˏCEKcV>wۻf7p#N~d`*c+vQ& t+T X[9 yY.o~Ӵ^;:_s{^ GCy)̻߶ >J3a~hc}L+QMUP=8}D#$xFʩx,7IβǣU<G ?6srEH*첍f8xCU9P2,{,ίkY T;ho$bQMY-)VzdVøe>ػF*|zY-F>EIMkN\&wib0z')n) LB _Cz4ـٹ2WeU}u ׺Kuų nh P9K%HlNSt1#:Ev"0Ui@M6~LL jo,d.0)o]K<| UmJfˉ0s!ģQVsY=EhE{f'O{p&?.޼mv4 /SŊgo^?>~x;8|2٫7Wy{fQكt*)mN收0DiӪ##wPq%2>Q%r1m, ~fy %V\?zlkL%5A:wtKCz#Ŕ 0bW6HvcMQ.nnm]x};ۛDʩ+GmlihJ1=C%Kj9ϵ깛V_g`֐g;X]C\cuPhcrN}fx5~m?D31Ve_kM.g4жx3!]_iۆ7PxwW_ |HflZb@Ani߁ͼ-Iz3ׂElض9:}zU苏^6fM0C¼SPqQu6;F/r,Y{oo__yy\.J̹O ?R-sr" 1B3Ȓ{H Ib>( L,ئ*~w^zaZP2ǧ_i)e. VVBioK8v$+۩ 47˗n'мAT} 4dp? >q:i Js\J٦{w6 xY'K"q!ƝrTN5𲪟O>ju@2L' `sXeb#x"&%.L%f"!yŀֈyuLpS[a\sBf5YAj%* aZ@r6> NEwoӭϟ)%bO]EWhy5&MЙe߂3$;ޟ_^|Ϸ.~۳!2 <Ȇ&vJŘ-LwP嘭gBp,.&iL#ZS0柛6iQȴ̸z0r# #ׂ]_sau # # # # #gadq^0rnRb #1]_E^j&RYs N\䣎@1VogЭgc>N]wЮPbe3:?"+tdAJrrΥyK)>wXBs@﷑'!W ~1}Si, :&]uR:˪EHe~C,(Y(j I#=6y\ѹуX*zm)zo nn'z1N840hB4wѻTi"ةk- g, !ՙ Yl7D0Ip|I(qSEBf[CVl ٠+A5>ı]n T&P nre2O/J[eMྜྷV'` HvkP~{hkIoR8qɡS%X:g I)F XfGhBQ-í)]a'0i;Lt)/93X_50Jm`‹#D"c&F@:S!q%:ÖI {peFFBBp"r*9'SUtIA&ܿ\6ca^`F"uYjΔ!sP@+ Qv 04H8iDp ,qNUBpLI9 7ംs!4ɼT<40WjqQ `QdF"T &z :M3b҃uZ-ΐ6ը"X>b(ǽ 0||6̊qšR3*SPV"pO.`/0GPqX h UhvZӋ7He.Ufg%J#ijԃmG0ɢ1HxN.8e@\@!ȣ|q6p)Ė.97mU [Ѓ3.ֵf̪x|TY250ySAV04bF * h@X#Z(bЎU451ߎ7ݾMW+ZU X\ S.J9V{f5DJI|>E1̥:,HT0RSW-m*A0/[ 6 u wі> < MDQ>N_ CdX! ^Qt/#o:!^]4I} f)SJAh!1M!e8HEp7;p9( cHdmnC-Xb)1;Gn=Z#{hr @K8 Mh-F!)vbLp٠܈Zpq0ʄД5Ɉ䎨T{ ts[W xkk*(#]@ۀH'6B yG۠X %t7Ip,{6"r 6-MI8۬pI7Qݦm?LmDEsj?*A(,U<@6B[h9׊dnCaŠr8Jc(]Iۣs7<[M.uܸ'wX]elԆ↳"+B#:ϊ;ptȍF2fq"RHF ذZbtΌX[bZ"$TnرI D:tBr5@b|PӀx*;2+|3qD< \rYᷩ1ѹ`KC0FI, RH쿜`@Zs5(%J&ԫ"Tp <9[cdl\SQ|! eEUQP; /rAgynűRLudƗc05f\lKHӒhuF>#m1ؔ ~BrFS5JL [Mdqd""G"E.(mTmޭ40E4fuCt`nQMk2Ӷ஝JOtHl)O?N ApnIK? x: Su//VvfĆNNv-8p;­3\Z&\VxSia]-OBN*OpjG<R*;;c+ O Wg |*`^k_-MCxyEA)ŝhP ŝPiR)Qp\ˮ:ޖ^Ն-I%GR&5xA©-XaDn1 ¶EYI.H޳ + ! VUG7)N6^6 PG)voCx+Rp|}8[f"t-^lZ6d6.H f./[2VkƍhoAp] y&EWZ|VSBx [NKy?F|j{#+L{WT30}ͦYqؠ} ؜kgRx6H .M\? j~1{$ZhA$Xd,:27ߤ)@Vv :f6]@/Raq-ŗq7eVCWG+s~?ΕsW#gSWo/``g{yH-aU71HuwVvQ I|R iM3pd-ᶃ6_~4jR nm e67Zc. 
yV=KKbȒĒJ˭͞45S)&SgŗꞡSǨ |u02 X wnv78{>,_wHA}ϋ9{:ώ\Q0b1Ϩ='}P=n\dߺ49UFӼɲ^E)SVҍ{{;`YM'70f uαTf촛ZnNu\gMs+ήC}$ 0*vHSw'w8ؗCgh|`EwU/fLծ|8| u1>}"$ξd7?!Sc8YWV|YҒnUd4$ex*Iiק-.UP-v %׷A{ ҹziтHס0吵YɅr!G C UsWB|:Z;;sU 60C#.VsIbe+_og5du]>k>Fd,N5Q6?*믴{wܮ.xGw-%z* BcO'0v}X t <m'qFMgЧbnT|*]_{rI 뽔rɅj׆j KjW-rU<'-˅9v8Wnf]-$j40KgA?sjA#ƹ/ w2ʑcxb!(-wØF. EVP8-`X`]2_yBr[q̐ׯ<8NqcU 6􍳛ئ@neKzgܾsd8+TI JpZN,?(&BeI8Q>gs'XF(DK3\ Nvy{7/frtnX%2_1'NFx3 \iT=&>ق0>vj^>)PXgyV%'ƍ|栽ք!c!1nak3l@֫[{SYFwT 3(N U3Q }JB6[XYUv$!/\D[Tmʱ?6`5+ڭ-1))Pynݺ.njӞQXnh7/)xڭǣ%=m`>MƦՏ3IPخԢg*Vo]U&|D% )(y[FT!~Z<&z]jޫz@R8>ő=u).LI$3L-񖋱vӿ/?L)0]LyWh%*0)Ʒ4rؾannijg=5-t<|I}˫V'I %,V|IwunklGQXL>Not9,ݛ=mx jm+emv >V嬁(|+!vj6ڟt!'tԙFLsZ9* KXsn2FAy0!|:VI̳wHvIL+kVEqf;NQ!$:C150Ś/Uګ<&5Hy3Uxlm0s*\DҐH_uKF8R\ݔ̏[I YxQ-fE;'IȋJ_[Džջ\K|l&rvibR[0y3\BEFwZmtZU /?މmI\Js,J9hkݎfO߃>8r|~y=).W 2Utzr9N1s 7ߪiظoNIUb俐uaw5BD(U+_2O*&ͮ*>W\U_qЭ d MK2}5켚1>$|2I6¤0i*LڸOڨgఐc:BJI#Z|N'B5LR$}b mﺾ=~Zrz[c#l-NZL=mb2WxRnQn܎VA,6SBH[q̐#D% 񁪸b[U^ . D}86VUMDrTBT9{[YyR<)1|i0ñ|iҲtᠶXb,vAu.Lz]>Q?+ _Z FoN2)Ϗn7leΫP YoNjv9+RsRł󖳰{|YP?3+C`?XQXhHA[ t$EwL:gw0Ѽ5x8*6; xu~:g<4eo;8fEG/./**̃I TDΙ94Vڙ͍]k+rd6INI=/>VefMDJ.^]JmE:CW$\M* 3tS(J57]0Ӽ@k$T`.y x:6+Uzyv^(MV fM#z"QyVM;T5?ϝdYȎ!o*xh} >Zyǃ$ꉫM[5?>+̴؀VGxH!oA]Ɠ v.1w.=6v\Wb„4ޚiعU:@[MZH6u 3vF%c7UΫxeyK kٍ|ibx/T~xv4`|r/ .ŝLK=0Nɏ^^\W9pp3A)-,^hΠO/,89TQQ8G^W!M{c蛳qO:t?nΦ |t>go`'þjk6U5HLy6suȒW~ x,x˃;sPAǃŽh '>{|nPrW>7&GjD0;|mBi)_!@ɬ&гY#5Yj%O1ɓгx;[9YQ6DtK+ "~0y0ǣn8/F;|ޤWanW27i;xLz,:9dŎ.?w{Sb3tkfbALHRI8"IG$2Լ}&y*1'ϥ!(Oefq\^:LR%ı0IuYvLF,ץ maj3s ցUͱ{qUv Tq@JBuT MrD乄=;&'$-jNSУ;.pJ 7j/:1aP+Antx}PHvAaITE " p?@Ge^yL_, -|NW-̻`@$$pjw*CHi1[fVBHcs/rgr$(Gn+@#  +kvbTB/3r^+=/7ZŖý̭:cB>KgPpKoV3r}ysǭD@%74k$ز˩&.2nbF3͠u9xJ R(cS ^j4R(DjQHJ벻9s7 7"t|s59 RkMx)9gg2++H rak"<œ#!-Xxȏsȯu|-/6>ߙ_ӭ{K8H.iC'b):\q3rM77Ev^SDKJ$$M  ⢡#O4 SYF>aOgnD.flXj3 I n.hcOux,ZUk}isA_PZՠQcps Ѝ9NtISreq r2j(mt g׼c`}7TxڍMI+HbvDfFd?vc؞5<-Z!){lRTvIJm/<" `~˻nx;hٶˈtعL+:e@m~%|~v|1> RL{z-<;SUs]2h>fo֩y5W@ap<'$bA3 oԍ} c~HAv<-^~x=71(-iVNX=5pwZ?*u8PB^eus9u}+;~8٭ɗnzd8xjΓ"ڡ^oZ1̢w"#읊m JTVhf9%X?idSP.,8 "8Uo!c92*P!6JQxiA 
}w־3?ǃQ9l1H(RNDl:TS:/ꭅ+qc!ƍq?Jub )t09$5vµP`c}cBV JOBzf=8A+`Rw&k1='bzH1si-;Z}rhqaq9yفX$mg.N D;mۊ3VW۶aC2f=P[w?ŝcPM2OYkWsY ˗sn9 -:9[Mm`Zj4<׋re/Hud3kgbk+uhG+mՀV«9ğnsBPA}Nlc>' [h߿ -,zb=iZP'x7^J leεxy":7-w!gL7l3Yۡ#cn4˛ Λ . .zAX# :1+}͚X4~~(р6_|I z۷, %{ʘ;UMޖ.:@u̜BD{n(\~8E#Å$<>\ {}@Zm!08ux#rzomXAs~avh7֋wvdoF0Nv{8h3z 0r;Y521|xW۫!K=@0euHjKcKvм u039|܃Ww7euEDùC`1^'yawMVk W$P37n}OUzP(y5$Mzr.Εe#$eRJĭ-wsLf|~== K;«VkQyacWwv7HS4c9diM&B?-/CߵT._! C{ {h<}J"]\""JQ,Ut4dWaRQdƔ܀"X)9$};u쌊67En b42##h}sOߪ^?E仳?zf */m~>ȓ#8ݛvFWrޟ;_=hEsߔ]z-(,#}J YhnQvjcWɵ[&E%|{È-ءy\^:ڀ,(zA@^AM{4}uƥ\% Uej csݳ2*Ϥ+)&qBVZTj YrsanP=FШ\-e`#%&;eal?bv4 v!߱h#B`!$tsf( Z)@t[AD͕Φ<0ؒ"&[^V4ȼ'23ʇ.Zψ(| )䬲턑{|3 hc6Dns䙧o Eza8pr"xNlD#dj$ñv6k NNĈVl Z_ՅM#GVe\.ZK<3`*% :%Ъ͂DC^בЭLr_ qw[='@a %'ȕ~tSƄ]7AgB$czK^$R>:HhvR ^.jѴspJA욃XKȥ+ `"bM%apv%kH̖k[…e m/o͒Z\ 1\!̢[9KAdvA19kF;]?zUvW:;ySs+ KxԉԋfqkZ4lIÖ(`&ݕU$`'RE1y/N2XyIb ܳd2c\b.A⻠"|s%ju(.x"]vںT,Z+b ZWo X\4H-;>oڭOz^^H'r m5dy__xFo+npK jn|h x$a`n:=n wmt@I&bX翮 $Sz]2:z?Kَ\ScZK8Y.m<;M.JCUlO:㝁WvgU3j黎q̵rʹ(&B*1FΊX{z B6p6 M=]6Y5v'?,Nr\-$'Wzʬ&ScKrF$lx0Xu̥3{g !=%ʨ`GuTw `ق_;uAΆ]$Ip?tyv'[cpG<Ș72 盩mxhx,8zWwWċWIx+{ITUx qޝNI980.`p(aXӘfZ`RB1fgQ[54:690%)"x0$䊩X`%V 'f5$'ʸ&U6#dY ^'q `a{|~ռc+RÒ؍ G_(ЮoF9nl[^rbw9>Ǯ ȰϘ`sR,7P2Ai#^~g3͞w? tpY)2P bG,, fIM6pB`enp j<فh|Umz/ 7*uc=s)a|\.~{..]܆wt1@ N]¯[Pi/] xU9NUk1U# vem}BBi0\VО}5@)c8g>ce\>Ե[9Vpj;+Ƃ J*zBD%4ibVcfP޹sbvHL?I vT#cD!!Q/t oC7ă x|v{^å~,؞ !$F 3j|dTm"g$ ?4@.bK#~K@ 4bٰX6f4-Mӓ+i]!m lZ>g6DX: BHt)P[0x"4zm!g~=:0˧o"Q c99Kt=:&,Nd#p; ?!vk'$&tAIK( H3&эYcߑhpռc< R$ XY=zta&Ƙ`WclMFxc&<cA7e!5M[ *qwb%g~ayw%뽧Vodg:09/ 0 ZĒR%0ضDTĬlT iS3wcefK/R7RϜ7{f bρ]c92\ӽȭt:T'շh:l' n8E\] 1i3(#iD4Qb@qXs}ez8]]_Y2F1W|~=`aĸ3g?=#x| _oƩ|S{,g1/IL"@@QP" L(do֎usyB]<ȼ#HivoY{U[Xg%yw{!IJ[xSD ;}oUkH&m`@BS;v֍ܮ3'r=^=`KhgR+ԣX0돫ڙAbcj.Z ޿d/7qz؄uCK4w P\̦:-13KCã屎/pQFb~?'62RyꪠsCܖHۈ9ENkO:9 _ qk ҋI@,SWc%Oŭ.s^8$41bؚn⺫&`)y߉m$ K%{\|NcQ& ' ٞH㟝D,(Ę *J(1{TA()߭ibEOKtwR(%%h 7ƣYPx f&? 
;7gE Jzb;:OAt&c!8S*[؝X),qw #Ebu IwI`D<|֖ʱA^P }y۟GK9B6!M{?PօNzv06 ]:%[^CLh]?=oQ # NS"l̑H#uq)Ji,.ϭQTq+'lAe8-Uh5-~s׾b%Qڋ?t@>*_$w ^Pa=oj@X0dq n} gaC8y%G`e;@񗽳$^ ȍ*>~]:zb>gDqRF ċz DzŬ{ NiŝVJxK nv@qG@c cN p#ۙOISm>?xP:ʊn9ay&YY*"i MH&@8K(a^m`0,#@i j@XjHԊc͈22 mZ`&Fdf5=/[v1F 6~wN :'PM.w#-v!s $hE-%8Q"a@,Z"q"RD&R2pB.Ρ 6s%i!lV["w`+%vNRkZřD:o:x1x b H( Jmf{Td,Xb@3aENIѿ-?s_[,湚S;yF]_z1@Fx\: ~Zty^j5Q=pF{:ގTH*4QS;`WuD. vF?@!wg޶Am7툢dсip]u}tfҢVVj#KcR|Bg)RAvn8w^.j2zD(|K'P+b۾,'&$vMZa .gIl6^*_ EHe9k˦@,^A/op<C- |`) |40fs;s [mnͽWwD{X 19eWp Nk p]z > r 5`h)qP5"UD'(3Wdfr ]'$#x +|*j%O)Hn$ T2+:q̨H!+td3Z"6n@#T0ޮ|b=>زmj*|z6u%uR1z ?"_7߸c,Оlm{ɄH?}k+57?>erULөY߱1J!oa9@R1|LEtSBr0bywJ0 j4/08gYƪgt<4\RcW_B'Ƙ @fUXet&rمh ${fDS.=V7-'~ؤvF3'AP)vkB*)rԎuM =>G%au @1F=[-mN^rHzF{U+pMem#JݵӰ62X%|.d!ȣ,.q#YPw]\2 0`~2yeh"0zX;I5ŏs^ ;3r HvL駶}5=R쯯޶ $p?t7?GNat/&J[r ׁr1ra2[2Z67%8SHidDRE#hqiH7J%Jt-4>)M ~]IV5x6IgHtƾZ5]T\+` H.h' "9=j( ct`;Y"?OXlGY^HKW {tu}e|\揫G?>}zRpHLm D#x|L #h"#;5 I:aYRRe"QpRR$<aS*S X#b}#b8W>4ɔ04P4S*n?^_[ӧ剕jb0 9qRh:6r>rqLVFyܢ :EK'3eQfeFV"  jnO4ϤQ`^y*%;罻BxdWٻ޶rW})ھp!i6۾ bw~*^5rݢC&)u9(0bX:p X\&C)[YKK6Z5C\Ǔ$[d{Ce[t6'{$+{*CsEh{\@#t8wS7HC}z?ל>>@ w¨]D ?Ї%A &+m Yơ`5%g])Xv]bIFm^ƳUh]9F|ӿ]GfΖwcSۙ6FTu$I(XbMD%Rj@ %6NeHD$l [SwlSнhv!0"xa~fϖw쵐p)@%J*f罾xo'$ثxѡ#S¿@9>U#B5$Dʨw`7lTg7@"Wrބ!mflyNKOx+Z|6H3Yh0HZ:9t0XYbol?7ufl&o#TKj9.z9gMK|_sqDdA)TUaIT8;eOU=݃nUfR썒uqޒir[XFc!qSz>6Haia$tm7<`5cq;l9% -%p?bNK=ZbEL*Eeʇ7r²Oi8z c4oAv=7zhTb  t$EX0ⳇFEJC&[BL{j2hMx~ʻϟ:6+@r/:4 kAedüM"d6Yv>+nV_ͼ*]v㥊`m)5^!߸$x~JPĺ|~R:X+2RH瀆 ZG:,dWJNmdƑtԙ_`%Yϴ~[CIaѸr~/?l~;/(~s}ߗLjzQΨþd7@!I5΢@b9tvtvq~>BϗM/R¨(A G1*IAm>fKcG"kp?G&t#Pl?7 cD%ex䖻Iu@GL+ԁDh}:v @!7X|>l9&mԂO2_2QKH8MZɕ[?,~LA &e39 ^ $Qf;0g4>}WVM*2r$}^TfqQr rr"뫓v▾ϳw),B=V/Mh*Qe+] @Ok'#tāuZzu o)Z!P6i istA;#4Og|(l=-#@ZQ]nZn27=ԱbXtmŀ-A| R4JHk!}(Ez¦_DbfsunFZ-Pp >3NNkiĞ-gUԌt)m}aˏIbI|*i(*7Ǫ;UN|i.Yip$H E5R/ sC%RzXQ$OГֽN,f!F=rym4x6.h@NgHMF 5 t %axK=6rW-l3(9̐"%rqamcf%߀l]<mP.p- J 6O&jVcĬ`R8t161k^#r6UF_t*5FQ*&ݰƀ>)fh+1 %2{@E)+n4h=at lqP L" \¬R(Y؈ْHY( رzC6#yFp} +eiuJ+Q.x1uIe7:$i6<@g]Z"m[x"댴SH풱rJ`poQE>4)q/x遢N02YګK 
MYLlˢ2SNI`–3!݊-}Zr#ɢ Xi,_!,m p#gZWV펑,">aG&wC* 8j -]AV!riE  1rC]ZǫP]%*=bڜIh`[JPBKJM2KgqCkMUJz8bx$[>XŊO?/o->lxuIr/ Fš\,ER%*a)7;"Toy޵Cɻ5LGEO{a0d,[UuBb5VxX sBɽso*2p䈆"ϊ <&UHHmʪm֥[$Mb@% S U7Uܯ]K0|v~1G,!W~u'= OBуP|8$؍h? b=z)Cw_w^]M~^mZ&z.=p( Ojk:ni0J'V#֪҃+x'^Fce(]oGW}Y([8`qpD9$G_ II#{{Y٦3U_uJed풔[q8h;` <'kAZOg]8wx /UOz,J\&i@+ >5[ X'*0t>a1_o<h),0>Z<Ά=yv,q>՗yjU=︁<BUQS!gZc̃8gw4xcv) Qd8FGh9:*ޑDMDc̳ݨ%c6rq!lcvb=@Ӹr&A. N[Fl{;FlK<˙Úђ+]=>~SX!x}yq/ 8^o?aWlQ `|Ofͧ{t=_5L≪1򴝈uPj~3Eu֋rj 6T>Bgŷ[w18(]j謃Mv)Zw$m4=΄+}) :un]r{bqSz3N9yJHNØ"=︁< 㚧B1ETgyǨ\ѝD~w2Fۖ6̻->Pr>Xu_4Wf.$ ͒4*>'8 N4 >tYhlv`!y!+$.jD@=GBV&:g3 '}y!pl YS.^pT28ݺ :BEݨM`"PCNcr{xć,|n{gק-2p!8*  FʑC !cQQ ZzYwrLAˠdrQ} _(H= ,^h#r]|W &1]'Pmz7{Kq3`]gOZbX$3^h~]xǗ't"2ՓoW'jX닋îSJfB/O._Na$JPN&fl>[?8`*)T{'`wkq[Ʒ_/O.bf!')40u7L/k_ˍu W?{/KDt5@W\qbt5nt cx@t%.GW&F7Uh]IW\m=4n)qSND^\\^=fVeYy"+"+EqX%( TO\rN c{J}=Jԡh UvPDPRR=`iH+zQq)>()݈*Q=Z|mD9IkǺ u#@n}?Ƶhz?2q7],B=r5կԍ_ZkR!{dlTB,ۀbLJmٜtց "m28&ꐫ%H>-_'|jI\:jz'ɋjq;AbF:w~ YխL?6em.1B/c{Rܔim =E0ᕕ^?~oy_I#cL#ejHzt=S](jԕv衤=)YãXڝ5!K DmodqZ2FoziO^Q1M~zCZX밦jX4ex\(}0IX6?}V>{Ak`혫?nT֘ճ8&Sa87* 1̐r {)0 XrB7&F2.\K 3 T!LHDԻI#*e!CrvW9-`J/ QYaJҸ⤢jwR4;Vp.#}{R 8(=F] -t[z=?.ld6Y>.X9>uqwW}xF)Jb]ŶZj]dUˊneG̾nm䄅P֗ȷ]E66%e팤,x]b?\ "$hXhncE5OșEyepCcnj tL|$_<3S "kΏ>^R\wvvMugTͺ3qs#Fu'52hdb4%sWT)"7`ˢD1ح%qG0i(<xv8:y94.Pi`$hEׯ9)̆RtUz?=$핅)@6jQb2_d+'خǨIP`0y4!`l<0gFĕ@WTW 96%n" A0{d=D!̜FFT8 MUa,* V;wX)# KǞ0i}ƺ \[H# 0wԊ,˵S> YU $RB$h\;w|W{R`bmd0P 0 R:(W_꬐M$A%(G$]p6/wfrKF-OTrkl:cp:s%-Rd'QۮӼ_g9SU1=(nُo~TVJ1&gqgOeOF\nWEx8[!Yw<+y}k|xF{ wNp2&T~ףDVpGc(P"즞[!3ˆ']myBXNkO/VXeLdc]AOe8Bay側.D{K$ڽRz섴 YaD9H/8Y_}z|hLepY- Y$]9LvLX<81#Z)ٻ涍,WX|r_鋫0e{SlRkg&h2cIJ3 Q Zp{f¢i "L8U*oQg! 
& p?@鷩Y3!q?5?OueG9 ~y7/m߳6#9wx6YԽjfa~v<&_nr;}jˣ7ΰƣY~H\8^|NA?OWT敩l<of3wz Dz7x=gn 1:S&¤JI +vd7q qTps_Y wu.Tj eْx0E "(!|Ms6؝& `.г쏬BX=NHp@3i(lBl*V/auW 75soq=32:Ec,h`m >p]8@,_5HDL;0C~ ]$(s[;СiƫcEtvnL% |HMV v.%0܃'ofeU0nA4Bࢣ:w5jztqt˃M(?+жOH Tz T:h)BX%}?a;{^݇$amftJЖ'=DYДXHG] 4i#t+:x?T IUP in^Nrxj{Rs<`mӧxC5G2%]pzpH[}_bhL?u0UqE/ 9s>iz!w꣆?A; w͊L(Q@Z7H:^_9pሒTaLy EYUQÌ/H,CQv E5oLӑs z韟<<nbQ@(ifiZɅHc&$*ydžq"7M  m5-C}wg.4T{s:2։oo'H)gWm) f)yG gJ+`ڴBdw=BI & `@^\?|zަ7vo3XL0# L$L$ӌ6Oݧs`~\|X+wꞓehXf4"֩2&vF J"F0a9$Ⴇ0\̺wg,vi4p H_6DJ2~d?^tfk[Df$N"3Af1rBFNn׉uRiXXC灻sq99!>\'K;EdyṭE_ $n/nfA|ṭ_ȱm-zc;zEz&ojٺ_ƉaV3}?CP" Ed[*TJg,A3v_fxmw~CN= {6E56F߿,gZI!IcLvvtʫ"4(D=[@᠓jDb?9B6sf ]bv PWC]E';厮HwDLY*،9ejm2n{W8?,EA$EkIu#:Y bOƓG~?1\1%輖Q"Y ];-kkөh-{NY߮nBD!Ԡ?.kt.ph^5Z1 U`Ub [kHafB]eXhls Pꎆc\cxf!TTƑ"E"KKzq,Cw"x^;:dmטcVwqDDNE*U4Q LLڒl>߭4vL'bm`m^GѱCݒIE= 9ܥ$m֧Cp   smڳx#驺Ol0$|`> 7w : nMyj[3g7Iy!3 gm۔,;/%+y s_9P@ns8%Y"h㊵"R&I7v(q$ok[z\hֆ;^\IZ3Ꮧ]s7 mMgS74@924V$/ȪQGl[JoMJ q` /K/kύh:4">C% m$DҍYx&rFz;޹Ԧ~~{gM4 چrqF-D 8@+Dn-g@1>\4q\H Jwꈒ]ʐ穥nQ(A^vϖkvuaGעpR)$z 5oh]D3l5/R;;E 9}]PEO;N t} b8kGJO  ֏pjBqdKzHDŽZ`s{:܆WD{Eg ::ݗ2#ϛIg T}pNғRzzdN12Tr&b D1Fd*fdKo5DT[PK"5ɷj4?@٤Ҵdz}宐^?Ry nzgٕ_m|p+ Ƽ ^͐ ݽ.FS]چ}xjaNv!kW(BV#+L'^|Ln/P-PƝvn]~͞Q&藻gv2~~_fz>YQ%m 1z)py>~k'NhYAdF$2q^Y1<.cD:AK!)j 6YA*DJuʴY&$QI,C*,ckٻ6$W|؝hC|p?vXb/N kljdga:P*+3,CkO9n3g4% SQXXf7#$|4y*=;cX 4j$ф2ͽEHSivia#W-ac@Y'ZjJ13nk@9PC1E0@N"`X QYq+Ŕݬ=D*H@X`:b6֞t3iJ8e06 4_ -@;XYPfwb}( ck5f"Z{p1[ց% KbTV\ vmlG38uq}Xê<̈Wg[X~.r_&;N31hp,{@v;8Z. _bM;tPo:9wl:ƒ-̣XG?q(&ߍIPМvTl:hΗE1$: AcX ` őDZAGgK,jD0!|}wMokiAUr( aA,Xcd)n"H] JGXg !V @V1E+ǘ+ XOp\[{) P/'ֿ7J:@? 
it;p3×M:?G̈́(8ȿΒQ- k/ s)"8 5 HririLBʼ/s [XH@J% J^A#VIn KEw?N޺/0i響w+ɗrzWǾQhUzE0}-yo@~~ O @xJ&Qt78[,fz|WNXc;l >upGߒ’%ۓOWDK lf>~Gw=Tj8g Ɗ ڢZVzüޯYwO#JD7u`bs3S HR-A~[c=T$ISnܮϟ7Iw?[Z5 R8m5i&Mm5xjz񯅅"ھשu:иN}}OpwY C@c 58PE"Z.0S`k:)KN~Ny_ݬ<1VBZ/& 1wl(ߖ;'ˏ?ˏoI&WAE9OD1nsM '~2}>g5$u`dv^\% A);[̠D!= NvYƗYͷs v{wgqDwrz fZ?qDܜƹ%nt^RC29$=_䒧YB`-&K([͙{VǦdqyLX7;YWB:T|9O׼%܌ C\M%=}.[uY yǘ!=ҍubڙ5<ʙ,h,p71*1Q=]c A <:BB荱=PjR ƒ"VO{0Z^FցƹU7fi .|#Ԝ$rR]ғ>)\mT>Ř# 2Xധ.u90Ye8S}Z“Lok癚SRteb/nkffc'ni4p\xTE:w)x &\XX``^1-R) , 1G=Rc9RJcq-ɰF[ʧb4ct,\:.ɴa|7 R.Dz!Ap(-ni,a܄7Z"ӓp.-N]KӥEKpdd ;%Ak %aɔ%l+D+L$ܖ/jGm招*,Јc˄$p7L6)4s-pc[-!)96!%|vp)d 9ή\>~ry9I\^[ִR?V,uƽPOWŷ70q[d^xbUui) ¡gԤ:u"nI 0Dw&WRe75a}8vcsn 4a7&wxuueESpz~7f:Q+b{ЯW@ѸVq7`VE'rw-ܸdk^B+xwJH4Ǫd*UҌ\2eBzuhL:ڣHuExHOW>/~h9O7x9x5L1SJp>!i4cHcRj8'Smk:AGyoZkڂn`1@)`*Zp@E( $ C;|wKP׏n,QkZM!}XYhn:{g~8#rwb\k`5ˡ]Q iE"=X]a \ql1Qv+ * "H .vyY̦D?5kֲ PJ;&zADӨ4 VvSyCn~3*yͷskA.X' M8";dQ#HoH)8m4OL]=r̀Ye|r>>rԘkn@cVX᫴p ;Z3VQbZb>Oˏ2i 0!>Op?Ϯ0s#/5`Q1FŽg7DUm:KUZy䚮x]Շ# ֒P!dYT2*ڕxxx%B‰ɮj=wwG5$bg40; ECBZ [ QdX^٭LE)i1-Ł4íKuG^ZiJ:7h&8u@>(+&BSLz :t~zn?A&=9ġǪ/ъC#6G# 7o?ûgNo/0ŖEX^N>/// ^NYtlGTi.*M ²g:pdsh# q|DE'8a,Z$ӕ N%@-vja>[LZq!mBB>Ӯ- 0BGӮ< ki;PnSХ4YãJxB9)g%X X+ +h+*ОhXsFy?mG^;a/lXFrik$; ?}}OeZWE JEmS%ӈ -ҍEu鼂㮇j-$g-f֜S'EB-,ҠIod C9Oؗ~XcKg2gBN f_WY3u=薛c^ ̵ᘁZAZT?OOUNäz bm)c~rXy_KC{|D3?v./U``ZES9`ֆavNNVB aF90R()EѶ= KUxE9SoŒpe-EC9Nc^H= rq+^YQDHQQoωL&5 M:]@o)̲:}y:wOM⩻.uJhmvM\gn2GaNOn@ՊGqAY)%+VpcJ&ǃUrP& 4A&TdIUt7t,y_AT2hi:g/F&ِZ!JLuwT6)l]٭G!/M{w#CJY`yLj!\ P\mAX jfp %-!pW4WZ.ꊝUTi2]۽npnv`赨c\ˑT#K#^ǰpMu](xdƣ!eύG5D((;O>-"P5k„}MbE&+ |QKG?lW4:1!\aX] c P( qm-iGFIG1LűF*&T*jx{FTxdzabZ N3=ֻm#L T ';RYrRS$Ct Mb!IHsz}qOq Zq9+ L2宰_žEi1.*B\#^/QbJI P )\Br-)%?ZS.%/ IvfK3mR1~FkiSX`1@B<҅=:('B.) jsA`Fg%kqel0G'+ *L>Bh. 
seYJzt0XabݗѭYy\L4>9PaG[" ˸`u˃M&e{7NW~9̟5ܔrL!fye^g_..~=eľ=8P3yyϝ}Gj4jV{Bk'+ sVu >ZrQ)YDS2Ìmf*g7'61&dzeYc oAs*1# |J BݬDAcvEũ'KC ŢՖ)K{jH2owgn$/u⏑lpx~2Chl Tqii}B#PZzPa>=mN۟=v~ uĮpяٔ2B^G-<3Ʌ kV3/+Bč[2s~xhjs9NOc[ ZyޙF폧7&i}hG/PO?k^uӛp85x~zsO韸X\ysww.nN,6w﯒?u=.ǟ_2W]#Oo~vOu Ĩ5N= y5zzyfD|s_v΃)x|<'2w OqiRJùu[X' 35&tL?4ͥvg >ڌ˟zmzt:_I DK-oo6nsO?E3Vtlp٨G/R` MÎuMwҎL\QOfptz wowjOzOӍ21>P_Fߌt`U^_T)ScUgjpr"rэ췧@O_R|˯ׯ> ;_t'7%."'~bB& C?EH%>* V`rs {FYstu%N `p1|Ǿގ4)F*|G'Q]NOqߠ%6I23Op4dHgrL=w^Qz^\t3)Ӽ!BO_ߕ 0aupq빝Ϧ(buX132s Ԓ!0i0Ydd0@9(vKQ_19S>s'y> [M'sOʹtef|,~w BjgsL̐KywCl{3V.ES ʓhM yZaeVyF!gHŲ]^Kliy}VP#\m&IUJ/tz)Jds1M7&SKL[.tM)EܞUz7zPDr4;rf>[L\s[݊AW2&T,FuʹHYDWX%gWNj3 څQYz+\KmVϥK,Ֆ:uhTjI(P(4=^%BT%iŊLjQx8 ytcAŪ%.Zz tmTp08>Olb]b:́ c J-&y,< cc0yo.^_}zk7^5IcwQq7o6y ֥Di\_9Mg/qnhߟs)&owM0KX 3[ݹߕ0_LA*Jyv1[IqWey"oQ3+BU;Da⸌@z "rF2qP6w_wɨaXjy̜ dU"eES(>%ֲ#f#^C]D+g,&7!b"3AdЂe좜7ɄaLT neʼL>@W?ƿ.5:Jw׶F~u;#N̍=o+s^pca9ywcd&&Dfrb:]?fݘaPtc-w{=c,@ج"s6n~~Gќ*2O?"0=mmʹT_8m3zzmJH[?FL4kA_ƨQxmJ}\3IgQETyշ+IB5gٜ[ԚOFk$J43Vi/z?~_y)܇'_=zYӣt8j{ݝUFQ9x42=?h'Qo\} "=K39DOq]{v鏉5;8g$&*t>FA~( Mr/GVGd$|x<yIO!{*PJhIɾ{,t/;g(ǘeM6͒LaCs2 !7+J ”P ` 9(0@adPg96gZ7?IO6tMxq0N/UEܥQ898bn(S1 }K <(I$ "!9PRbO@O\BJKwᨀJjQ8&/X1+c 3¶Xٴ]g3&D6lٝ+e}߃;yz:ʰ, (̄\=`to^k3娢 uXI͍ʬQkYK//INv,V'ۊ$f"^}.^86yFIYy$O[d,g)}!H|)\nZNKa jg #Rޫ[\S*2MM ]0aL^]C\ˢťka& \Qa4HٕsڌD#Q)2oԃٲ:;v9{caZ^w{H#D,:b-ES*)쪩E1L1lqu [@1q`d*8T0:D@2VaQ$ ʿ|XxR1+(+8z1Q~~{ݥ]J*vKOB+xCL8 ?~U\^*ԗ^* t'ئ IuBcvqu/A oQ^Gqw텹v-پ"!X`~4m)vRg3 s.).j1K0KXSO^Y}^]:i$@O/t>6v^1y'&4 ,muJR98$_0&|G":1q<;P!vr BőDHx38ʃNP. |ߗ Mt3 <0V5;qRV(.VOG~ <'W:gJ I*$X1c0f)w ǘ5f¬^v03vJɲ;@,Jj+ќ^UdՒ=Y"Wxݱ_޼U[})eE&Q ƹYNMEG>J2U*$g9VIRN|7'- -e D"Fd>0͐1d41d32קmH ,AyiR%!:tA  udf-G񰔵&4v.c&>6.omd?_`6K4 Xpfƞ߯c,s{`{2bx@a.\ػ޶qeW|={`diZn$E-JRm˕i~IJۉe[aLQp3 g$ehXkb2,t>i ѡ5נ*qι槇"_u^n\)>9ؖB]e2Tjǵ$ L$#GuL`(beN;]aaK߰J 8y(Cě&GQ0a9*s-2v=zֳa!t! 
il6YFyBDZOCQ\]}QJn^28ێˆ& %ʉ9!,w¼# K9i:P⮢͡0RfW /qϘH,1qFU@ ޶f-Ҧ5%Hq&E絖p`6z^ ܂>wFri" ㇅1V xaׅWc˨ 3cN>[\SJ-2*fSۭ卑i=`u.4'F|Ymk_c+ZJѺ0k Wĩk k< !{a4_jB?!\}X9nhVjTbE=9 B}X+߇- F'/܇E^1Blx[t䧄:}=,߿;2\2̠Q!.Gw@h#K JbG{ *^3:8T-]SjW{";v,M+ҹ]p@Rwu⩲*Tzլի1nV ꥮ&jqqj\^l=Z+ CHaWp?F~%4:)6\-tb%^EB^u*U%d~ܿ**?^oC~g~, b NΘ!L$豵p䉚X,Xr{J#Q)Ση wu}( E:O~i$Wioui҉bGuERmu/3D֫WѣK8V n4N/_ޭ*Uœ'\_MFǣ_^MSKѻOÇ'.>t>Ưՙ6 aM8;SM"È7rSpߓ-f ?צ7r[7fB^t$C6tNnG=SWI>22t6{p9jLcO }a<уuo? ٔ ;72b{4^0!Y!%D)VϾ~l_l7]`Ӹ7ao{6ApVc .#>] V7簺˃2Cgw,mّ׽Ѳ[#ѵ{ 4r ,d0ȄF&;]P&<ل0\O!A A䪎bQUOz&5|Kro(e,<)J?`ɛ{o7f p }g_r*zMSsuhFVHcFcRM"@-Ɖ.-GgS9 sʳijs'\ڴ,/oĥ+sUW=}.u-$Q]DT&N?@0hⓓcuұ`KLUSxLJՑ9 {9kl0S| Oc^{B")[Xg 3Pt#S e1pM#]A疸sIHm; ELFgIY_%`biF[D[NJ!;v(O`*TDuiv$B6 ҭ_0ñ@qE[ށ`L|B}OX5tG6E5"{uMA̪M!aGC`ܘ737+5t3ޟObn:ꤳ aQjpk%pEC:H6!FƷjVu~,}>^Ni*.U88:T"źA߉c~VO }jTL$i'Q<^:?7>NyOOn>g\UhUN=a| {j/,ifYbqz-&> 9b@q9}eOAS0ZA*=A'+c9J! 涼dЬf/|ow\[f<7{3.V|8pKI9d9#@ז'|xL*K;VMc"j&wlOKx픁w.@nƯJ/XeˣBmZ* vED8F{6WX߷ܦ3b&!U'hdĊH8h+I^ !PuʲF",kc[W>])#;/|JmckxVeFjBb<ھ]r 8V?C7 $2?-~#\>Yh1_gx}Y.Yd7 f2lsZ%lgr$SWi3vɌwy lF6i  ^CckI_X,Vw!fYtA8 k!$GPcbXAiiyR4JZ 6;@2bBR2p .ٞW߮Bw.@LO,;.77ak'ZM\L;  <QGYڮF( DTB,T>' K9Ak>i)bеhZlIˁ}&e8 jɣG6Jy.P/#p<" sO-znx&lC=wuq]6HiYtc\So/k޷!OD8(| TE@iA3x(N]$:laj>\%yiL"qjbW֋e@J!KL)\4rUx>ͥ@l{mF_ηE^/*c0M:Qܽ_K8V` n4Vi/._ޭ*Uœܡ&M)Oq"&i7VMsx68AF3@>(<4!Fe;g^j\fǺo'SՐ#+ȎӾ_mμ~b#_b89>>4ǣB׳Gg;zv^3l#2G?jDI5V_cq_SddNe4oO8`11$^Rad_޶&fNE'NN:쨪V:(ٓpW˛لZW%cgT؃0[4ʭ ig`Ք~ j➛E][sF+,=l2!5Tl'Ϊjq"S\4@Q q98e_}7}qWaq<ɞ .OއO ʍ^~/g_T~.#M_uI糏׳w~f9j_'K~?lGfOKgrr-6!|p3-2|}f2_cy]e!w}oJ_Ll$)  4jH aAq kⷳA+0ܿksbV(˔zk3_"&5g 6 濎S;OgE'Y$dUr -xO6mDkP@WLY68H)r$1Eh@{CkӤZ;$A!|NW??)qnh+>-A3ϷŘ.R1T"c(cZPB"R\i,q*P*( Y+3 hHzsZOПgy:`r74}MfrΉϵGYd,.Cᡙߛ%&o(1WJQFX4$ L"%cQQCJuUj-FBrB-rk+デ\rFg[# BC<|@f8JM(4B;y+㲠eOt 8CÇ35dXڡB x V;{!4 huTdU`RX=2݇\2i0ܺY\}USԔ%"j,c# +nCH: _ Dƻ&Ej΅Uzyf @2"^ELJ tАQi>ϵ,pF:H7DXG wTDNV4BK(*g!H/d,X( ,3x jG14A~"ǟ/)mߵ?vYķzo}Օ\!׷7?dH ləeycR?N!AD AU.W*Ψwo_@T5K* {a)N-V7>>+&.s1cXox~pҋZVR0}2^dcum$U(#1R A~]}%9 jQT1+yQ2% R1JRɔ* l*(p]E` h`+^Ҩm A!42謺JTRE6% )%T-f{{'rq%bP 
@<'rFJ%*4ܬ#xa^((iU1T_qxcz?=5v_ʧ0)(See)Նd1N0-0cFkA#PJnBT^ے%52k)]GJ˴pҵUܲ1)# ֙vMӔ f =0elCI" 91 :ɢx#ZPY81֞m9]$HaX :bpAFhQn.ec AdPQ@ Ɠ,ՑKvx$`QM0'dGE!-=d=Xf F)B15 k6Kc͐ʨENkPi" đ\{niJ!Z 5$O"bMC KqF!2jvm}wי/2-^ҭןl=?L!dȨ">4Dj[A|q; 2, !%˥D~&be@f\*+j0}=P\:p۫? PҒ`a*|1$dLOZf91> ǣ"CTasp'aA6Cw:5);[ֲp lJ1%M\j;S"` έG9|XѮ9e2`HiU/6JY)O:+j9lց]JeTVX*:+ KSi:nS^^۰!Ͷd˔#ÃW,UHpCV )+KR<2u dQ+%1WT JTrDEnS's('9dq 8LG 큓8m4Z9 L^23q}Hoϕм{t&d9(|m?dK{FQ8~D[nǭaG ˈs,-ـ v[׆. X;2Bcb1缫0FZs͍WFhDRN~L+ԃy s(kpڮ`1&zYDI>nմl^){+\Qc?`_$?3ۃ2К5%jJ3K@w?Ґ1]HQ+EBD &eVBi 株j ȲADj٣Y祛P6/PWTW?\6D$X&CemYY2(աJR"HBJĺXF'-zѻ)L.Z\~q;Qܗ59E(g8[׬5#tK`zk:P\!hYAm±ln=X6O/]j`!C\Nh0VQB-O-e`J1S0j`@o&TEHJ#S}hv]Y~]g*VK%zE^ʋ&KK^f{ 5Cnn&wgN Ƴ|]^S43P j&" vThw(Bj5h6](fV.v% {jD;b4ϪyЀ4ò,a1Ո=O*=Oi[;I:0AC˻vi[qKhx f󤼭Cs O4ܛip9}2{ijb:'G_L|h͇$jO7;YI#h<+UFwWNG?FvM64Vʕ_;˧EWLQ蚢{[ xIv/e ej5vyNUCZ%,ETuHldɥSd=&%#*y=ojz딲_Y>,7r=LRq',xF^<՝df7_c$J:DƘɇ.)VT.G26\4f@6CwxC)Qߥttw"zaT/ n>/Fo>?ݫmְb4x,17wh4jXYvk_Ո4pc h.y~45aDheCM74}CM=>Ӵ^MjB5#OVobDr8Ard h*,[b-E$Ͻ+d5ɛQ=x onϲ+<|wם)vVlULa!uj8_Iކ'7/_6IxܝqӛbZъa,Uk}wT rp1|̏%~ZЩ$٢ǘ} !;BBG$E1`׳/EzPJqGlŇQ8Ki&kܺT)^M.UTSL )|­9fzן(>7κK{rzf=W'` 8m_E46ZEw΁]ܓ ~vi"'9c֫vv}Eָy}0U; jk{k.l {,j;`d|'3;kivl~ZS8zd "T`4JZ4C!Ƀ[G \e{ҟIT*)z`)?SbvHAIudxJX^R$[)Q?=mzeJP\h6 1L^=jȵ53;"~˼ Xe5oPA#JBdž.rU3N&݋`GL׳es元Tŭni~^T'{/ap~'w{ւ ]I)ʭu<P/vvK'W޷ʱ*v7 C* *wvySlU\<uKG}z%;D@<L+ȧ]%Z=oP -8dFd*z%Uc7v~vZ+ $җ7"` 51aغ|oŶ{H~ժ-+ Wȣ0Gr X>z$ԡ(E 㞪|׉~HZsIJ#((O^})>)'1ru5Ue?uhŸVٳ3Y~uV>ǰgBIzHڲxo%bIW|xǹ4=FӔmE-;(O>=#4EB?,k}Pc%K ҙ}ݣLEv&j~&Ƙ(;:DG觳>.QH6G5AHϝ}t;=T:]=o}[I:ΏNQPuuFPϝjlhGiIx#8"b0wR04 VqD"*T)%McZp^(\N/\mxv.xn|D5X#Gu@gz8_!'،?Jl$B_m T5x̐CFlRمE('X.:ȂQiSȵ_!ogYa=+²Mu9)NF"XXab$kH\"uH[`lSڳ[ɜy(Z:YS2͞]8 [t]G3qU+I%M݌iv_x~†$Do 1W: _\+|]3"?^`Ǵr;/A[i 88˃|pG=\RK)/ߗv,5貗: ݏ{ 1PhQJ @c27yJ~_YA uq{2FU >yYE(dF;UU:]wB$VR8i㇥pu;ua4*e&Ԟ@;)6 ߳$ƚ=QMZ%BųԬf=Qݴ*;U!?ϝoAJ}#ç'bL  k["Pjo?J2D &(PA(J}02siޕ%P -* Ga'e>(a ?BhG nBG{ odVvo / rQ'z%bhw{:W"^F:]S!ƺէ 7UR ; _~D}xNgl/lTgR XNu4]7u,{"!s$jWy_+;]3Ś11yG_VrkDwpr]Cb&s=N3" _!UCv!Z/y^-D7#BtUeB﫰ꮺg-;h|O"Zi&SmįB_ }CR >9Y1s rw]T Ǭ5(%i)^"ш 
ƵB!q"HDm;Ru8H)sĶA>lُY8v[ \M*Df6]SZwWh @DעJ*CV:j0D#(J[Ź~24„d\vl,pyp|7EbeHPQ{U^6 ݺm;ɦx1"#Ɵ͏Adf##Ռ,D^ ;F rsZqxXP}فYlwȨ=:»5[)/)&RRڃfI_!$ yZӓh };`Q]c)"K! 4/`TZ H4^*(6H&`.Zz%"T"KfCu76\ Be>'_a-/0Z;[.Qêmu>k3YR9m.Tw^MH$TMH$r5ɜҸ)jrX$'FS}㩊W O,7ΨzW|e-1Ds[s79߲-YKD'Q0Rm1D ,GF5R3BQ+(&k4o՚&H\tN)Jk, %N/8Y$SDV=废lرN9c5ޙ'PxVN3VAdCU3QόabNJxÌ>61\ .^|xGJ9\Gƕ}fZѮ8TƤ|yvr7ዋ).\/G7fGGŘ=˴k9޴m 0eB\}BYZ^YgJj\U K#Mx:ɚ`n{nd\0GI_$q=RpS")4! ;M؆DW/˯bXHu*hbcAt*h#)eqRX 7iln?N~#P(JR>݌"6joA?$T#S9UL&7!lZhKB{Z4菷@^:Y dP kD Z6db{L_G3Dl#iAERh^hFqD p(]2nH cLY͋.E/8mgm}w9*sMь4p!h% cW$5d vFnbmn=1Ď ]bTkF -,X PVyh6FY*xM%r YNZBʃ*u^B[gIoa3{}a2WH̓Jk bik$LH w{"Q?؏zdޅ5$;|B4|dgLBG`3 y9ob=5ɂy$Ga8`C۰[~k?M͸MhƟ4@Җ,g8`iOa,֌èGeZQ^I@MʲSk8zs1Ԭˌ:dsA)lHf^qaws1b_V_vxW -\PiXJ\{m He{\TT9R VsrGDLsdm8PBTPjjP-;ajN.2gkPpٵd'sJ}-(dw@%ekյ׉_P4rEϗwBD"$::B!qvsŚ1E;9/͑ S솮7%H >UTɮ~Jݮi";NpH#)q{k|\:~S~k)9,eQ;!5A pH IѲ01BZΪ옲Nhi6)!$egl&y|@I9Bޥj1]{?VrpZ3gǸ5*DgŸhn~,ԕ@ˣ,J4u'&Υ P騧㪧A;wTrC ^8%SRJ;KELuT]!M"ݹtXRg5 }똚>BEѥ Wer^gxjVUNZsPZyʨn\ v?u#]d-t|9.',wZ5 ]dz524`1-1:r7mqu0TnZԯ Xշ_\Z?~f}SsōS9؃h_0iq"AF,.uG)VQB0.nji7 R G8⬋@§[^:,3ϪrZnȄG7&Lއq<ٯ7y?<#_^I5}/9e9H0,:4Ա aIb4;Tc֞4#<\ | #? 
[2G3ܵ`oXqw1bD+ۏf{t48)+9R6YՒZ@58.`idz蔥Zȍ~%Vg\qpEQTVh~N% /g_#PemWM1ftOZC8f~Yvd+u bw)M,҉'LIj!Q!y}7@:(rk`jL@M-rQkJF4Q1- U; rYnWZsFl*\oq )[Ə[ڗ Tsָ$߹o9 ={wXC19SN0/1Qs?<>tbNkw4ݎUOFFt 9a6TFx .*j"잾 e;z5#R7wjf8uzou$1׺1m)zGNPp9@*}60iW? Ʉ W>OظYT3ɂMAR%i k1c ƫ2Hl)G}U*kU%QzG([7HG糭!"V5'a߻W*M]5׽ QRE]wXqubG.RLzP_*N|XqbK  WnzI.7]==8RnQboDOޛ¸m~%Srb&Q 2KXzP?WuΠKo0\5üӃ3FQ{&]' {e;&0WaޞdŃfh4 dߘ<#2 +9DW}Q fhұ5y= {+eb%z|n"$Uz^B:mArVɲ.UJԦDb"@ڢE418m%5?n*1V1=L:Uȏ~۴*ŚZ {YL2_bc>yP~ĝ`-cP҈q }[ 28$<&;:o a#d|n~pD+VlYh[.m9Gom趖f#Su6ASC1: #E|#6'a#^bt7H0 q)`H <>Nɔ)FĐiNVJVJ*ے9n+c㯥Ҏ6OUb#Mx#rtdOI8 TaQy'Y@豪9ZrFQr 7D|(%JjYrCDuBCĕ th7 BpzQ E YZCĥR1:;}S LOk2 K\ӑ8BhK{F\(m!Q`"W&OrbHq(Qn+d f !\ȊY(| sKރ r2 Wv+VOKL) Y*Z!ʃMk|],mZZ G :y];N(a]'yixh1TCD l~sBЂ`"$X'"r!soUPC̃%A<6 % {PTM8 : :Z Yt@+Il!K^TTT\âb.48ЉzrKZ?DmJ5d/A4H"|0rraV#v"Je1ӰN\}wSnv5`OO~3+o;rx̕,-W aDӸت6W2>!z&㽲 8*@Z'r"s^*b{!$%ae>҉ r:WFj&#',I~6:s׉9eZJG%e];TW[8V]/O)y -H7ӓd/aqf22pP5dMxD( :8@wrxﲃ }нcՅRe8h$}yXYS {u!ݧXhSϊZ:֟/?͂MD^uMxq˽Bկu(B[L?,,G{nyNon!3<ȃ˳g 5x*+ Xd+HÒ3V]w;!7)$lt /UU&P 4v?6NB5FMڬzն$ 1i@N|nZ~vQ\Q*C^1:ͰOԄ|#}4#/p !αb9~?a-O13"!Hѫ yNn{L3dcOШ15#hqIhl m 2B'h1 x<:|mLdMh1:`XϓǺ /#@|A83u1f 8ՒQ):4w#Ī*D/R~W D&Q8Ow;3S`ߘ*W\|Pt;,-B  JVÃArK~xWs궱Am.m[W2jD+|4\Jg 3NJEӍ*DXcY57Z[7Sx50@}:E7=ZÕ 3bDgs~>|;pW3=i/Jl6AgX1Ӓۚh3f+﷋twm6bqX8 ~b2wK_IB~DEy!Zٟ_rNڭW?]$`'~_F/G_SໟߝA&ޱ3ı*Yv18nx$`F4$co8$n8b]TʖYVkū*j֓T.KAS, v`6-IM$/BK_zj.ިOKuR4>7f@ җޢ:ƅzWBiQKIR!Ҵ4R͙bWi\Ra.CK_zj)Uk)4-EvZT hWG& ilPec~v@ClR e箮 `X*[T5s˅\XxLBĔm"b1BZ^ώ_‰T_oSLQ7?a| 'aLb$+rYyZ?UY_òb';+տvi9{Hnʟi-2y"|D@K'u3tRKf׆U֪d)Z+nL5sT1D[ BCSb%9fj?G$QLTxYEыKͼ |pCm)P3+@U+pouC"mE %NU,,m=D!3&+c4{o^A|Ny0RuÈո!rXB RX}wήZ≦q%ﴇ\i>ߦf~Ο#wZ?٧dD ou]ڭO.h;ak \YOe~~s( d[q{.hfɷ}AEۇՆs5]ro'Fm29ILLɝ(`[a#O0Xv/+#:RѪ0s<(ŧ݇z~.]+c .0sQ5/9S]Sw?i~QS~ՋH7geoGp7BQϱ P۴klSShg?a{[ğlܐi.Zd]M/7ߏQA~ouOǫi@kS qE &7%i i$줵U8w`t0uǼ[ws+ZH`ELU@m$Z_4V#KyCSt(8'K?:73X6$_c:!l7_?@\}G{mh! 
-U{4n6j Ix\0)F$+\ Z?hǒq+ŸтsHY aX:[PP#͘LQp' $6$ SLbB'M %;GnĔ5U\T\Tc+z^[A6 (gQrPXᨗ)Hisވg-c3"E81 *ME=W$L` y96`T7"~Uwq"^{QZ}pIլT*gsQE+Y3>5 YZ%Qq&|QwKESLD儁XpQ 6 LӼj\jwNT/m;H(1ёRj$z+q.Q *X8>pz4C,LA9IgT'nVbgxk ͦHu,f uk&dL%jpP C3@B. Q ;}E,`TXtXAp ឆaJJg0Z@کH"zuwYg.r0?fϿ^foł~8ڒx֘GDfXv㧤݊:--at}x\<|(c^Jl0O)o*=|Z !t(x)ݟ~b:s۫# \t7Sҁoc\ `.w`o˶&3d A \AhGyO)xO=8-N;KMOb1r̻ q5gVg&}Q2ĐyQw x[q?&K/._cM_b !9n*%+npjbݒ $N@w=;1"J)~tv9ۉ]% -nipFIߠ2&)?Pq]P`@20?= GaѕyK@ ];3Jp'/O}$1eib'r;]:AʳQ"̏$C4qj/ѐ˖0u)#Di^fQ@*hK1RØFtʟC^F˫\FJ\j۩J:Z rvK5S֒*S`uTF H\]1Z`3$@ƕ ұp:q^ HLQۙ4.Q4KhR}7I]WҘϮhɤnM ' %k~x~OYE&R('Z&vO*HTq><ޔ:R??,4g,~֦awj5@!h[wl&,w'n'&$@` y) 8(Lb ,+ڇ(ؔA9~ [ iWyL^'uiUqX wnTJ<*تe-8Xpc@)l RZOF׎v%t8k~K;x0N<[.n?eI{$\n2p}FO (&.(>vcvϹhB6!]C ڹ 1^?N܈^dl?LGk{rg3}7+.{f5ťiy<<@%_yz}*C lk>?l3~byQ/evG߬ c!3D*gZ(q`u)+&"+椛E@X2[H{ʇzmH+<QH׆4FtՇ#'JBx VB{\u}8I8n6}%Q|q:!T ".@Ϲzg|.6-Z:Bula).RzR֯]12RӤZp.RzR ,MJE]JB5FD]vR*Ҥ(!P6ҵEJNJzYxn)#@<'w~>YƝ!eD1`R3U.RNM:8~ᔿ<_&zK3ì|-|5ڋ"zlgF)/ěu-*\c ӥ"et>mL`U9$&cAzAt%oPpUoP/*(& pq \y6=^4q`ABal† A:skYI&)I&~:Sd2/ 0sd%q<F`AX**06V 5$*+jD>aI&'[0D.?x;+iʸM%qc ӌY #& KZ KQ~UoF9Պs#&srtXFbFj q&i@(41 堀i2l.qo-b^J꜒ @8URC-pޑRܱ5b9k*u R*!1Q;a=pA_ۖ D1Y[ <*=-V5nOfVE)@Uqd\\Jb@ vj׋W!?GVVRc9߂ŗ!0aTro+<|mG&#oW,@Sm]/:s~6p˛:Wq9sxڗqC߾l;gMG>]7^M<560щ)W(`(5YP\tߵu+%^Jʕ܈)FK0J2RPI4:"h0QVFqO!7}Yzz̓3Q3uT;ߓhIbp`)(S&Г> 8?0y')C:#FJ49PM,=Eﱇb*j_e^sAb11sNn ^q3EhӍtj0L?0i'q?(  R/я%"/j0%6hMVAYKê%dsʤlc|?LO\m:[yv ]w yz3nnzk?ͮw.?6yvQ4 S7駇i4GlIw|lİl$4K{Lz#jQO~lVQCzvŦq}4Z|q?f1RLS,<~`w*͞h1H ^{g8 gx(2KG`qڍ$ӽ9 s룠NiF!Qc5\KlNUh\\j͔!?{6D˭9F`[jv;\?EjxּZ *~ySgy-dr?O«?c;?w՟/F#O6䄵J]SY-@>jU5Ji nQ#:R|,StMG(Rq:n]xo-=8P!8&񔔣8'(Rq:nzHSI@|,SE@L2y}91\wiasӐCpbrK馣ri! 
Q2SZ0~bf#6FZ>Is韺tiӣLGwi@8P2K (YS|i5z@tb3|H!ث˾F[˂r˜ N)U:#RJI5J;n%wLXYh./pi nH/ RFRF¸CMhh.4Ko%kNrƥ|7YT0.R(T@ 0E+(fvP0$Ӧ^)*j/1%*-,J.TE.#@4L,IsFvUnJ%cFG*ۉ|}saj˗gw4jnІeLq~؇͖}X|e'E-zE/; .Ϲ Ϸvuzr`?~c>_] CZ}$ަEAdG[%K>IDEa-!I-c(|Rj#Dz$J';kԯs_IUOT=pj6 kj{mlA-I:U8#TesupCHw3`A;'4v҄M(eՍ($}zX䏅~bK?.#]/tf%IU~z'Q0?<[󬼷>g3C +`pΊʧ{>=os\qEfW5|X,_[>Vd߻ YZg9/̒Oݼ9l[nUmݷxs>J6xQ3sԳmV>FaONjxhm>CQ!4pH~P&`*)!A|Jʀ[v$Jh&sZ8軂j CJNJBRBqx jFL;ANw[jF|*r"JͅHve 7yAS(ZLV-LF>@$ _ń踙, ߊz4 ӥ9#h@Co#1zc h:dqh5FTzXbT!90e*^irA"tѺ6yw)XoEIW^-/\HU7XWK4, Cl&*2/S6D)(Η IX%cRp(y8 )bC<#JT3T~ӷ'Wd])D*ş6rhF9'C \Ǝ= )2>50cFD'k#Yzn%L?+w"@7T75./(yo \7G!.|DtTx-iDtZju:ͥIpzuT<FKu:JM8_Ɉ8IB)nDtS**!\ z5DzKgHPDZ~0E T-/)ھ_!Pm^#qNH&}^Õ##͡NXWpyr2Ls *K>@mjeALA wČN?x8f0 3|,S^SK7ExRP uR#݆an6OT kߌUC>*l7_+ dnQRtE34~Q!e/|5vV>\N|UʺAcS2r\\@}ex|25%wL>Dz+/sv_e}*ɂ\ W\m1jWW8R^seHeM,5hN2rȌ%sԚRᱣO/odE٥=-%iǀ"rHUF?{վ\ret㗪;]\UrY nekzm130$3yWKF5&`zL6h7V-WCY'6L¥%i=MXr'U -(z9"WS5м]õV4dfa`gR_g,oUG,SFؚV Ѷ>xao=02ѭ`ZP#2\5 Yfw58Y]rc(}l9Nup/Ζj04'fļSoLN7['(U" xA'X(wK5荙·0KK>[YSq}۳\)O4"d bܓEJ=+L;%sw#W ˧WBBùb r_Ȣ9r N *TgLz_z .i}|iPPb.Ǻ`YOh?l^uzm{W0&4'1؆UFr_!oC`Krް󾂺7x3|!& 1;FП:B$o~Mj,,c3Zd-Ϸ,f5x_=4T?}j>QM&d%jP͑B v$-@jH{CK&e#G1PZPpkQ& CL)PN:MSBKPcAUq,Z,A֪Q;)w՞oz4f.GC,- -9ۨ[m6QN=qN̈ "Z(nOfAKuZ9.x+4Brt*i=,:ߴ5xm.GdQ̄fvӣ V[洔J;`MT޴*z124ЅV;V+O#>>I%Q8gvCKa"Zb:@KF2%%f899u\".oÀKt U,T)c4"fƗ%9o .17tLINJaJNO0ai 3 Nu_VhhɤibkAPitn5wXs0 5JJJJשV/ ED[l?\q#PPK%q٣w$LU_?RMob xuOmN1||u1xme&!߿[n=YET|ۍ_On}\ 3N,҄p!`X}i|;ĉ¬~uNKt&vhPL?<;+TosXb&*7hMBY$uoewO:`KR̀Y+QxKR_mH͙I^zz^j/\RTe6vғR6CwG6޷nGU?ޚseڐծ pY-7ԠpW&gy! 
=W7ĕ ?~ 5odRƳjCA&m7Qu r>.a}g8nnkjK(y0 R; wb2@zHi99YF$Z2-@8Gbk(V(Bʿ:(C󚉴 anWp( CfK>qA/aS&4uFyGƝ Z5 wUՎF Q7r@H9)w֏TڱvoeKF wkؐ "hoBk3M`^P7quVGa´"8] -)h}SIUwMۇ?/hS]ۆ)&a7U4MȾm X!M`Bj2 *&[+Dƴ[L-ƛVNx{|-Ro E2Srf1(\4q!wt)R*Z 2Դi ͶUՌ Ho%l%OVDD)q7Պ\ȥZYT++GIkuD1\+#A<"Tm}H!; I4$3əhQ@ za荂Ad<(πdxJ@5~Rk)G+%=+c2Z:B1|-7݊08m•U(뭜Bd 9cDKb!vT`OhwCBJqe{@$ҟF0(Eg]Ԝ3&;҆TtZF9F0H_1GjpdʇVޏ~]mIm:ҟE"R% btqpĨu̚,hƛv^Ϸ͒}Gz!ǴB>$\!sS[ !i=at뫃6ae72fF!ˌnCxE;TDoQ'o&?9o^B!6LQ@!3/L3tydSҗڐ3vf<]/dJq#?_mgԏdCUi6{h;@GX$XIhFLaC6IPvj< {Ff2DoܩCgc\Pz Bkm*ǩO4'>}W—إ?gok*͗0w>"M}nM蹫!G>T~%I _ D$7{TkAT矬v m%r(1zֿۛԷm_בEN(?vcuI5ɇ=Fy`1Og怖y{P:ÇQ<1e-p _Nl ld -71CW-\w[L*65pY6Smj+Kn|e'$A&F4j!lˇp,o9ffYoIn?d4Az JJiGK&NSjkիD?ۨ=T;s 2H#Q;-9LPfCl4P*kaU=[Ylw{՜Vԗ)s.9e+HUh)*Sפ׿/0c[ ^}XyW`,y<6&f|#[૱`(قs&Ô 2B,Abnq l/">]hO(b垠uI-bQ|r}o?uV?ȱ.UL'Pa0CjkO[#Vlsd֊r&Q0 euO/5Yi$8VG{gk]p`;9F)憟5Zi~)&,vk=6 ϴʕ, 1# U&Gf1]!!8vT@3-]XZ:ߒhx:{*BaqR2^r70{)jE~ ! #m6I%8@ N*쩗 t$z Sa(̱Jե丽`uU;ST +霭v [[Y ޺q5c"y$.1b+\[Lj4J1vI[[AEKlyr-pÇx-tb49ɕ_Aȡ4j5SnhxΣ+r OJ%p\*z<3>>-7rҐB`I?kTpz4QQPv2:&]Z84 :Ԩ= G58yt{sYт3߰ w U) wo_RYJ| ݧeD@llճ4­h`k=A~?8HPˢ_m}\o ߭no֟o5=$'.'ɕ_zSLn>ogeda\.Z0'}GzNlUp§(sճv؀jq(ow %.ffŠsqts G5^𙨺f\ '>BhAq5q~_?t$mq'1cHa&, š'7b TQZEXˆϘ2Cy(G;ẫsxW}vh"d\p4:\=?{yJb"<;ZT8ld~su;5*{[Ͼ[j26}ۨ  {;@M]էYotO_~"Mx͏(]MX@նMf4Ps;ʛF9{|ޯ[ܜ d!߸fٔaGwɌ-Fw/ :m-nOm M4˦>sǻ hM͙eb3ͷܔ*w( `!߸vl*\ ~oJ,1ӻsfCahE8 H,\?}bUza?!j'۝[;ij[sVĪ[;AjLsEIWĪ,9e@8@Kȝ͹ʓr+Zs8'ݟ{Jˮ\CkS~SI"g{<*̬_23E b_?qZw~V^tݜ4]*I7݌,aOL9=|ҭ;Rds̐Cأ͏i=e)Ű9 f<(p4:aIqq0 J1hB f̪#0Š {B|5yLЋB@)p(˽jwǼi 3F .fl]ޮ?WnF7^6[RNYEh,Xfg}+|$}LsL>DX<ϋH-<{ڬ#R Tck=$Z #ZGr(ɴ¹"[9vb̪s&8:U, غR1L!}ߕ!X9㵰Of_IGjEwXeDaܕl4'NG?Ws-PJ+h‚ynjgxٲ,xJKdcyd\ trz^ JbEi/_cߑ& rd9A(kGƛOn.ޅ-[]苷  7i7e㋮^~-VjY9B=MY k!F^!q|IhxtlߟԷ݄.U~yiMf]$h]Wp^ 9lk\XFZjnӄ7̠|no; 한uOG^<릟nV1!9X2`ha(/0"8\)?&crWE-C9儎r^3^FZ0]ժK&U5* t*f3S幯PbE8JϬ6X4UPjD 1k4E(#aK*4Ui##{62R!\Y&7uS˂u% Z%=N`j,P2 VȂ}6/W .?wz˗HM(X%_F~F6oW#uH?}[珌s_Q]o5]d%Jf!o(1KChcūە5Kͭ6\\RB^>fw>ѫ¸.(i^I1AZ|)Շ_EwKe ^G^r)VÑ:>sŅď-#Ď MBVX" [Bri9M(+?MFd&⚧'~1:|JШf9 m:- -M\ih3Gw<*.o= eb۷CՆ(n|ʘy }6K#=7ճƪVQQ@nxo-% (G^ԭ5A-!Cy3rGjlU؁%H2W 
:z,9Pd!,7Ҡ>Rx7qr11ox/zYvl-1һ a!߸fٔGqz7պ3gAT`FR6*t[6{\l6|&ئx$`9n7(Xۥ0}\m#]{#ԭjS6G kν/뒖UK86Ӿ5'!Rfv2Rjs6(SR_vtëܑp>:ë+5hr+)~ T{{$)Y lk"qm)TZT.LǢ gWZrF\L q"F}sPeFal4!ʓp5. cXE\qTP{V[McúK}j.0ZcIX(yV\QK #<8 5:YT5/NR O?.+ׅ F@!['vj? POʩ nF"CM1Є 6hR%EӢZJR𫺐sh&#? >"{Tu_6Kdߜ=&o(%萺K  ySM:CX:8&&켏^)"&{ZXCdBB+&\M$ j|1h`v:G HR@k$vU y0XrN4gqB'O=|Y,uΖ䜭(SFbw?nL_Y]/!'Aރõ(Lx׎M Pޥ,YFѬ-Sp3sS8rsgt+6KGr;&]xWŃ8ǭTplT@I cywm \]ĹD @6WaM`]29rK{s)w|/0ߍ%⼫͸Fz>hvPф*7![>ykC[ˆ)Ď;wc/9cn\n^;1=. :z]8@ e;n}jzloEv1BC՘bW{M[[ hh} kpnC"[#o>D[)&B4Gܖ1C;hz?ڭ7kc*2,q~FK8C 1O "'УȂ$$R𸴖|ʂ|Y42VY42WOIRTKqTKZ}#g$[)ZkReƑ "[,ˬ2/ƫ'gln9QXt9<3sin;4K%O}N-bplCSb܅"Q"yӳHT-Gޅ "hffBN0\DkCtx JNH9~l噡Bz"D@&l~ƃqfZ&iLI?,9g Z3V4 Ggg0)tùIIH}2b\R9Es[@atoM<:R !,1&6`J{%:Ecܬeļn#׾ULeO'&#u c?x'$A!/68` h^@’Rx[u X73w6Oo`5|>  ӗe!"lng-5v`6s x.oKiT n.]Ӎekays*CRnN|c]`©$>{p+ xZa(|$ZbsඃcF0mk-9å\+J)å,Ԩ #VS H$7K͔j. j4L"˔sp`,W>s!S,c>5 xGF=oZa4!qT31J Ab؏sw5'?yR}[=L54b(o o_x-~x5VkzV='Q5%Iu\s` IXYم"G{/w~Fx~At2Fm\C՝p-:,-քvyN[xz0L:ƌy<Z=cR_%'=pVN7T9@ĹYFrsU$}*S<[DO1HH3=YƆ^hc):{&2ƅc.EFaL4sz5IEdBO?U!7!}z^pCSb9ԥoFDW2~_ j :1&q(KdB)0-xfgvobN2MFLI6FS1!XdLvI@8_tb4f Y׆#4D柯?CG_Wfi-],vŇˁz|>5}a|7 /LfpqIgϋן*?f_ߺȇOoݧ̅h~۫Q>/nm߿blFW᥋ϋR},4}Z\Wv8wE@Ƴb-ro~5&;y<.Auds߂+W`UGvHu TV __IG|7o_g:l4{jt}4~ L~B ?'o'r,}4mexM/sk%ǿ-$YV8aVsO?o -͟H G_KG[,AaAMXgMW5+%xѳba8nA#R"eޝEX~$ZO| 5~gÝUj+V% 6r- `9g]l@Np)Je`YeOQ8(q^ >,8N[_ ,;r>"*~`:ٙIMqq}>p>|L&ޟMn3_~lup&ecIfU[z 㰋9h2 IƩl*aR@hLE2JC֙EܯbG=bW j7V3%GM=Bc))gx$STԑ x@T@c4*njkv#H)$:B 9wI44?ؙqPvgbtuU/N5VV+B_#Riz,M*5bPɿĊӵҌlB_޵GȮK]*$ַM+[mH*B_fND$Lk(gD+f=\/ +!)$hLu*Zh7Z F65 KRLz3LS"Y R.턠K(vZaZG{Ra,(C^b>4)0A`Aś3oP DUy}䵽}>.J4TSj}ݔqQ_^@N%ys& ldY,8ᷟFW|5}vN (t~"l~y%7+m=!ОK_ϮJ*S"auUgcϿ<۞W,1rq քC9LooQ_ APuCPÚW&,ԧB}q~*״8\S&K}p'BVV?¯Ծ\vD[z^7ipk.JjVh]`6h9[tBRuLDKŽ"/)b˾jHÍu *_#0hݡ)X D'8#gɔɖw<ϰf u;3WA0=dB9$dÔF5ihM$_lp$x !S%θ1)"`+"˝T[ⴠxJR-VlTX}Y0<+θG:NGp0v\9'.C6>J)%޵q#e/{nU|^-.H@ l{rHߏI==/38AIbUX%♊)G ` L,'Cf|wsCB\ٗ$瓭qe #9Hż\[r-M1yz:rm͒\Kcw)=Ib"0>(r-u"$!Xvs AQ&GC#XD,jҍVJcY))3F4h!Yz A0r- k~Q-WHe2g]`4h,I \yG}ekC `ږ;$ +!~Z W*KĊ)-TH+'smMӘ,li,v{ 
Z׋|YkM|xZoÒBh3Dj7C圾xw*zQK{FxS,S< a|F' _9hq_Uу?n6|IMCnov$j+i,j%퍴W'8]8s±[NBei51T?) %&`ӟYVc<~煆A5 Z̺gԞ{[_?ZQwUm|yu<ܝ 5b̏srK>Э_\4IZkt33/z"T_ᅵ'R P9biA20Ž+b.j aAJPå`B -*J K+RY@]G b10EgVޅcVJe61g.$- lm!9?) e,EC,k喹}4Ε>TJBiżTxFyհ?WV]lL$az9UZ-}ҭN1+krP'ʔh/sCLr5<9'L# krXpcC"r,rp[m~HrDwz Ⱦ6==t}rЊS.#IUe:7_?JU !]1jQṂc>u2 Pxͱ Hj}_>!Ӟw(ө|}KRqjyxiH4Rmb$C&C1m}P+ ?ǀ3z yH0Tk㴆8D%YZj qAqhO 41%b##J!A@rJr's,\Z(KZY3Ӯ#f 씑VpUpl-̖#EZKw\jY'*HZMsjP{Y@HR[v?U\ZV7K|W]>'tnvľٛwogxo-+CH}mLz 9M=4EQ%u%eᬖU%PSpޙb@eɄ?_XH"Hr5y/v^޽50oػH\.h4;Bz~%htyG7%1 - iE;qR4LQݩK^&"$.$.)Ba.VO, &Kx`l+uIc*wSAŸ-80%Rkػ롌(L]Ȭ9 E?sxgD0G*́ ?8{`k ti6-_KIlj6iB"OU-ϔ)WqET3. ;".&r` r>uKX)RMyIITmj,a`2A0+\ scKo(zp. C H\5kȱwmLMB|#7㐀u$$\$|㸃j=8Єp(-A6<>xS28 MB#no^w>=*&IU 'f6|,]M7K1Tľro" DgUPBJ_AżoTe" f;K ՠ+Dv>qV;9w*' bs6=kT =і91Da%'NhC$P8Mr7Z?}wqrr,[6lO5I;jR-XCY<[+O$/VWw=%N", uqxc\_zޙax ͟g]߱<,]xs[w^>L'oe 3)abʟ\ͧzynsMw&U)' UNB)$A`ݺ Fu#ź(oeh֭;;Һ!_v)~uNuAt}Gu2P~ںudiА\E[U !T>^.u^;HwX'>hqź@i!Ѝ{΄2S$wy+8Ժ'+~xKٻ}(?fN*A]wwݹ́ o_aj`DH1Nx͏n1/ ݍxXiR6ZP l rɂF D osuMvNQ#:Ȩ(bٴ PHyJ~\9e9L7:3i@`# @PJ?Q"oKj Q14ըкQ|9ԂNbY wEYIrqi4.dɬ.dXe^/ɾ ,*m&:G,,Hc!qlaR>`qgi۞bNFܶfb4v'L^i||8s"mDJu!9cJsIcb-7:DK/g Kl'Waq S>N㗢P;L/~ -& kI$XP}#jÙF$S^f ĎJ,bp%#/C s8aJ/X$8T_5ƘՂH-$WJ]M:Z21FmSҸt*(%1p/\ƔhEuGE[Jȓ[jP]E.{D B"'I85秥4>X/NKS}ՠ rqZJ"I 77 SΝ]b[،{$Q+6ж] ww4YנQ5/l_, iL Lab2PYCd{T Z 0=;7)D}ڙ":XZ7,"^a>F}^Zl {1a29 $j# X {eA.guL)4r 4H2 SJm%{ǨV4B[%cq [+K0hZZL20@ o6Q;ޘ#7CIC2"x&j.%a)CMYr BNi I--Lj:ii5#b#d#%(Krl&Ь`,XaVhX*1TT᮰QU-6:1y,2_-Czy0s[枫߆B I 7/U`*-~$"-o*^0fefA|uuuu]pBͨ+<} !}x j5z*S|r?7bΞIb=vCS!Jw??}@rJ VTG]D5(wwgwp',Dr[Ƈ JG-3d1\f.3m {cc- y0⒊t!*JD2Wg-A QQٟ:cqPO.UCV2iSq:H2`'/v _a5t!r6ȋs S*L̈́w3ޓF#+ ,vJozTNo5$J=vzJT^HvN%q1)bΤIJsŋ (b`# ODT 0O\g$Xн`X#2c q,0O$VRp,"B&sM84@S&ҋFm$b$FhQLS4sL5",b3d͞'͍VW+}y-y-jކ"~*GSSZ/9;|~R#^q_ }"p!;3^&3RaN4Wd9 RcfIsj87g!6@E Ը Q+P猩)!\ Wȯa1r25+I"ThHPGTFh)y-ʍXq{1l p}ڼF8P,d/6oxH6+>uVp%цXK/KzFa%$XU t}B)>H`KųJym1ږAz1!W{fIѐL S S$R}@AT"Ck+Ǝ9HJol"%DoG A+E2] 5K6T?4}_xO?O<Q_ʤN^fw tQrW 6Jg es;xJBC1 ܴnj`{MĿe ,3󕿬6M*~vM8\c8J$jY*HutۻYb*H$9!ym?7OzQ+QsWxQY\Qa?_&oB^ ~ 
GєRNCNvxnL,&*EZ f 51<@ ҄bMi{/&yvG:QeM3'jf|7T'-z$Y$ M l,e8 r&4Ӡ$s{lz]D"_uvzuvz|unFhc"B#3S8X Sgϧc0('w)HF\'IB)֞sxt= Fq3J&9gq_ZVȆV#Wl<|<=)(!Be$(α{bġmXČMfqVk R_afej ! TbH$N? cA./.m8S?Ghymd28uz(~$ ׉Zq(^NVұv!G;j+yݢQ!ѓE:tāLLc$*9C ֊@9)e9r@ONrD M~c b`G$vK>`\> i0LfDIP"H3yg1bL\bJ*y&ıa0H |nۉg'|z;+& JTr-͒2:ljy$a K)s$ m#mɔS&i?3nVwQLv=֤OCn~8QWކ! La_\򩲩EU6pzLA\6߹9y s㝜w}o܈gR`Ilw7ӘRo!%qkJ܏ސԑt J!Ej ۈNyc\uRpc")>Ш}/h$8!Abכ?O$#lJ0T H|*G>(Z,@&]o+s僓Eᶞ6KR"RQHUDi43DUq=xq=OW_[\p].y95#pPԡKǎar-^65V _5J)q4 RBJOA}Sqz1Ivl3TXT' 2%)$Ś$r)fiSw#yԡvS׼ zX/WѭE˷Doyݞ$ƉT(dɆeuك(ϾM`MGӊMŐ6R);wuS{^Ob(}C7|z!G1CO|eSNo)H^hOsyӎ> ]xsM:8a Ej#ZWvk];G%URQ!R& jQXȐ-*툋ܮ%``1:v 0c:_1} ]iQ=Wr*E\0AŘCQ+ioY.4X0ѣ~ukKS-%" z&Rad&llӺ6 p QHUj5EZPWN94gmk=Ҋ#)D hO8ɺ{Zo*_;Vɛ#=r4oOԒC,蜼WK^O @7yRЏs {8?bؐwעVg  ,[`F{ K^|4 m^ w,Rx dԔK)T}UA2KrUe9'C B;C9^:6'>\ݼgD\풧DZ5 6Fkh9Fm;wIH{?7ۚ7| \pl?pLJ(?1uEwt7/zqsqʏr{.~/2{&/w+0I"&+vӑhsr5P]P蠤b7%Fcz{7Ao6+}yx\Y#e5Yo<覹QH -_HVDP͂ JZ*} Yfz.5%/m_!VMĸqXBk#9[i>ðOXsp ͕}-;.']?7c ޳ٓIb>|w=T̿s!_ G[;}zL:i9],Yw||`̒QU{w\0y rdiXtN _z93PR; ;x;*M5W;;?4ޑѯwRNbz" E>:r1GWjHxG EG'C$'TQd0L9du *Pڕ?>Z3=W G)ߊV1n/'SI#Pl@B`b&ˤMM٢!OZYO%E~G$N{#ݹc^= so'5Ro"A׽{iW2ÞRe9׽B{z{>}p \u~H?qP7qawU}'~!U;V~ĩrgvthc~v^װ>9š5hwA߱܂*w. Zm ym8k~\f;L-~;z׺G̓*ƫh-Wl#ؿ :H! S]Պ'8 Y}t"~:SmFSըYc@7ex`It7H>:jקq4a{*/p<~?~0DlL^6&/C&?f%T,3 2vзRjP Q\ D:)GU>vi&~|K>T|-msHu91J1+ZJE:]BY Fk6F%Ôoa9n/N#Jɿo8gup Qi ulX0ݰNX +ӆcA[׍w5>I[<\EvY6wꮇf1=*YJ՝Ǡfzn}಻,FpkOnG$N_g-uس:Vx`Aa}T;kΟ q,.oUo` Yݴ ]e}(3Woee.ke`^ħ^'q;6_ȥ%)y]5()z]z->1*W!ՙ\ǶQ܀@/VnkԠv*΍%x3OSQlCp΂b xrDAx%ck)Vye7f 17fe78 YH$7ZPPD2#tP5=dI9`ѾN[ f/1j9rv *>AuR miA $DU϶V6<Y+ ٫͚?+Jnշ (~Fyh#ܴ.;&ׇ2;;3\k"wl̥lحvҙ,\?aމ0sTIq3Vt3~obѕrk2HKQ}(jէ{qf7ELZ] &C hAc.L[zvē1t TnxӉ!LE-E Fd9lf*8VgMThd^Qs8Z@μH}GJ2TPU~_j7D&vAF/>B: uzfK )몈<29b}0gVJ;vQdN`+-b¡El]p-d>{n;_(rҦ_t3e覥oN. 
]p`jm̓$u@}mͩiQ{bE>U$3K c2jBdg${2ڏQZ -.UHƢH椃 ]v>(uTNuT׎f]/Y6&'VCKC#V~Te=6iW"pʶ/Nggz8_!rYz12$y:Dmɡd[=(JfqtZzjj(?<Ԗ.5'y(H kRjO32@ÙEL,YI94f^ڎMdznƐuT,hb$1t9 l=#'񘖳qd@EN*qFXSζWmjtkYcJ h(?=0enW c@n;-`Ž_N :ᅵ*(.:>efv7{cL&xRUb8a#1]D A> 0v X:?H<֔8"%u`XÛFPClٽ:Qg(%QE؋/nb'֢zX\QƝ@o|MU{F[eb .6ZpψL!MX`" ?ѓE_aG=@s *J͟"S=! =:OmI-=ir=äC\.-ڐ䖄Le~а0FhԖqpt8Jh"m 8*[o?X6mZP8@jsk'1ȨJHÝotd #z@h55`&[=y`qk&>{iA5;*ฎJ.wM?9MP潶 6]d6%+5jHA,HW""%ZzpXmiM)#1m b&bȝCh {%>#] A^h|۱X\ON|-e5mTs!`mh0j7785:-뒄xu  ] $-j IQ9&ix:׾%7`wpM0nuKM|2Dz]&Ir)a0F`W8Ӯ/vǏ43O]Y 1e|&f2jm\p!-#q)nxW['7k?4R59KS֍4TqO]U*I!NaຑGk⤀-g=>ԎM}s +}ōb\TGsdRNHq q: 0#p LZb f?ר8͖ntX SAX*/DڀD&Lb4zIࠎRRitMYZTJeqdnCg ZKQOĝ:l~7_Nv/ϚU\25E͜]9(O)CɷY,HI,Er])-TD ƘSʍGE?CuHq F Vs['eHr[Ig(vp~TM$-) |6UNf×¼GhߺfVizۻʍ˟*'ZnӰT{yvSچohdex.rT;/@A}Tb10)p*[qm~;qǎ@IoK *M ߖh\Oe=߿=jo jkl8uTpįJrlzbWݬQ|8'DyՆ`,?K]~(.9'FzO.i! y6Ҙ'jJYj4寯67;~VeZ}qcdռh 0sك  .D=&G~Smvn*H1 1m?Z?tM~pX\9Xu~9ٯ_g ~R͹~m^+D*Iu^d`8!ᰤ^k祖g!EL^}tZ|qeE{49Ȩ/ h)HƷ#ƨ4kճ /9 _ܕ I%| ~+ lb0~zsK7H_ߟE[jo~]&\ɶ*:|Q9v9զ(e Z._C @Ig[TS{}-ib7>p`0g路r2 ]Br#m].4tE]}*eF>42>c`R־(RAqf o k+ L/]IYK_FիiL/i (uc磣*D4 t2]~BMU&H D&ɉHJR M``j.4D}ͤ*Xs?JQůAŨPuP;&E!--HnVuG"CCQܣpDDYViwV 5*=~_e8Fv0=*gzqs%>|Z!O^VJL,IoG穬E ⨆Q}&4`?;"byl }"qu7UaֻD_Nm^Tۖh֓L3e "+:][)3m ઃ򮘍gF}:\{dOg8.ae[%Tiz,ktCK%f1J&ؘ!EA2HA~ WPj?m65LPiaG>+^8QBIY ң|t>7E _ 3T ʠ^k2H<$2+hME(GPfDt2ES-1'#\ RF-`?B &YAťC#"(WT!Fx:΅~Ss~/&MCV%ONDJhR)Fj|V -DWڭȋCv7g_]Ki6:8,fAR2Ѧ$AŅ(B2Ѣ^[47ͨGsX~U:Z|\D?Ĺ U& R 9l\g)a>&߫jC@yd2#*-1.pIK4t6a-i|R[Ґk0k$G0*tԒx '&lBַZќ{ .0'3%(Jr~PBEgpP@z'&Obu/7s$6vEn{vRЮc5Xhvtrsex瘕5D$/4-$O>b=59Ywtǩ .8pk4ĚD̍}{8:m~XkN6om9S=OUU1Ҋ<@FAT 5c驰o qbJɑVQͳQHBD;WfO$8kpXl#h9)WgߠsgqΉ'uQ6b$gL!PFN8ǛlA,績mOGgVAs? 
xë~ۺW+8WgybvxWz$%udu'/]3LȖu*סks.;ڄ<^YN|#R/.RɄ]s[ ַWA|_PۯvOkjD0څVܹds9FTE_XbG1k轑yDU;+'5 "0t(dg)bL 9ÌkH2i10w9?vvSs-W'ggZGW:Sg?{}e3ߓE櫓ͧ}wK:Z?r6G.̬]s|Ԯo+ȇpD:?2[wzI;q;=$>N%FB@ ^VZ'A+l:QlN+kL*7RM=ک#TH,'VDl ';Ҝ/VZ_sCC`+̣w}9j;8 SYmn6_R)GZ^.@t ֕h,H:SnZW%~8].;buuuz9= WyAV)5jp]uK&F؜?oEkjed< -Mo˜A;e% y턮tM3ɡm|qM<gE,IT9_ZDb-&+R*- _Qcp /bxv,"]aEɞ |t_7ӢnH %6_vy=׍H\kDQP'aGw^7^t6/84-_qI,OEUMkK[vݼ_ [RFôo*T-\ :aaDD1j!zѝ 0,*Y?ϼu_o窖 ɰ(Br%It,z.`ɿ&ʓ@1W#v-E?>&au$f^pm#xm.'!gͽo΃1 )h)>-InQK<2my@λ-Ѹgfܳ ?<<SO=֩s-0`:&zx Ffס&L: 8=xE"; O%a&>oϠTO͜FRж9b, 9-͕HЗ󜰢$H )RAx@k=DUHv73XW+q>vFJ4Qz/x#ummYGi)ڧWw=qת]p%ֽEQyoP:eS) `, $:#Ju_(Sa >$e9'Rc8@p=GŹ<}d$#$u@W5I-ABkm7< f>_ ЀojYm䩶˭"}9=67 $UZ?ƃP̿&XQhŎ($ͯZ1(,oN6joYyQlP-h!'Hk%dur^XjXkVz*u%M},1n>/.6e "I$IUH*5^w4Dp3> `'ӳЎ: ʃ-UrHiXM5yp]@ !v`CU;L3ROnuZ6[6,YfD)Y\LU LKTb?YȨUc$Ԭ=jt1&7,6LHF bw߷Nz.=fc;5P*^E#s,@Z'ߍbC8&f^qM U369ZeF׭G<2ڱi`G GPE`(iZA5?4KCybԭҸXGSZ* _V]3'R]3ػh@{x0/ OpBYGM2DUVcUkS|KQ]Oҟ.4,1°[ +bn^%2=`=7M^ wJ#[LY)_"Eʾ[ {r _D_66ғzRr8tZ̉S"3q.B6P1eA)lNC'vQT[AKs"vl8L{EvJ.{Z,?/TT"),frky5xW>ޑkQR!SvxTvփSn-Mcmvˆ"t(Q (8HgJńRTg VX^εRBBqۮڡ$k 63:Z^KylC:/\ ֿjS@ܔ9zxh#m,}v\EP;F(9KV|gst(ѵr ""IIakEmk2?*b˼EY8WAxP *(K }z It}Cp/:܇A`yFҽCN7[Ly쯬mon*  / Z)5֜-W2f9K]z%Z=eɈ-9IFmzzZ߈^!Lz 0^uOzf_XPxa۳ Ӻ7Vs_ ʽnHD$-_~!ɸ6Rq svU$\4Ut{'nvp]Ά[EQO7\$,3U?RL+#ޗQv.MO݇׊Su9}wX4ѭWqw+$n5СMGvԭҽMA}5 "D}pv^j9~e~v;Wiǚ~bIK$m3L!bNڣ ddLZ9 Lbr]24CE`;}nC_cV%/Fa)(' "]! s&]qܳ.PV.CSzfW C&TtJc+ &JT'1Ĝ1SAC /&}w=0qU֛;Za g_jSy _/b',l^?ldzXUpuvfёn&w>]YsF+|7TW٘؝̋ nn{}&FB"_lI̬<*|b䮼>Qk 1 P~t7nZ_ߦ0ٴn==|U{72oAS7lh5Tz<թP9LM멖P]+l|o %"H_LBZ#;䝾yG8u= V/ < YIHS:fE iU҂?FHAPQҊp|05 <Bu!qilޞF!_7%jgKrҶ$[_Kz %].͜?=Xo@K{ߜYڏ+v?-^31x|]|5T4njwWo!䔰ibgt[dww˭~6øl9o?$M(EM~1.fo_ДIj5"{GR0x9 ;'A9. 
EEv6}IJ:\*_*&a.?L6fd76fV}7Nt9[!zO퍷ӥ4'nZ#"Wg4o遼yVE,^}o7jiAe&}C2CB^W/ra+ra1-gMsܔ-E`[Px:F61mC*^2V }?տWv!X]x秞gJaSl9˿/9zS_`.([>uZ`U9dLl{r~u3՘0&m% /%wYjZ#94@%~rSjMY&'|~SAڕI'#cw{Nk9j}'W6JN3BH8 h뭍ELc/SWux>4LkyPNɰ*JX{vȣds l>#d@&97*!=%g}װMz}vu[V|դy]ٿ{|2AZBgQIZCVm~zt)n&cD2Dc5ヹeUsPwDͷ=ڑ'_Wkv5Ce)R]V F{cSueә:pM)~dͺ3[*BT'1%ڷuKh݆А):%q[7M1XTN7b!̹dzn&UN R$Foy[}NaoEKy nؕg\A,$g܃Ǭ?lthQTݬʽoONAXd_h{Uff"{,}{4~oPbg%W)}X4>>DQjL/0Z{!ݹr{Y&75O/7aqY z8!cp2_l˅3 PB3u8&Y&terud OW&)S D+9\ⲛ^*0_%'r'*` 8UsQQ` YiQT$D*W& [jփ@vL6^Evcrq6!ڠԿv~ݵLhYzgۮApĆ~h/bX9<% us/a]$NC{3=E$(K -DyB80Z "T he 8VJ>_ ^a1¦B`ˎ;Q[qO1zI4S2);Pzָ:nyu(2̀:Bчص~$-2% @9I<{B#͇|5^q&˵T0u0\i@c}/;B4[q< y`A8!'7aTKRHG2'Bwy[ PҔ9PJ#(*ad('lceZjJLe),!~c:UIvyUآNV0- &UZ*H^9"EBuv]3͙}ghc 4ҭȲSP[8f%HG-yQtzbVdOA0Ce%*RRqCqVX0˝1?ՆR8M PB5408L,ƝSjS6q ;H^vZ$Y(3#psU^"%?+q$ ,GVT@Y*@8yE%Ik8wi4QTͣ&_`:|4&ѣ`5YI;S$cEEPMJ0ƸA&:H Ma FOf9ZTKٻ_s%uܩL܄Oܔa|.qZ#6"Z瀴b76^ L ˈʱA\@^/>GWPqLﭷ8N@z٣4B,|f@4$^1JSN@Q(F\-](#˵mݕls3f+/W?gְqI2St OGb%6c?* $].X3Q~A;o.q4j 4K?d-;lY~cX5pO7tWjAi=TݵD\HC^cCuZTк DuRcVzf+m y*_4홢nS6<:zAti$9C9CdnBB=/kl`4{T|m:(- ܿfwUR6#p[-ZS}:-uj?v TtYpUeJ'ۭ"bX%5-eo$%!<{u%\/ ۿ%I<w 70߄ioєoGc6Gˢu&"6Gu(W9r pw}/ZU%\ Z C)G9_ wMRM TTPM )ق+}rVֵB,T Eğt`p ltb /S_x[N|Ka96ki%0%c!¿ګX_E( c!>Bf{F"KQ2\)NN;IF}&*vzWcVVќ4WaBi %UѥrI5ʩ5U WE`mdaN9.8:N88RE \'9BPm )|uLz),䍛(ju5C>n^w+>m]O)z),䍛hۦ(i6ظN+?=V9>V2T't>Ht٠*zߴy;M;}ϨEjܜJQfI_3*1zq[%-U2U2zҫHFWP/jK?EϢP})0 )4ʒdX)VWf`qciWԌ).a!jW<8hSrlwF7 Dg3JaR'é"0xtX;Z];B!rJlKֽKg˗BF//Ωl;4\V]qwO\oI{i`m&c'K8, 3yOƾ\ObDjBw/LIcW7fqRSUwrso6?g 4tB68ɆER:ySW. >gh_8qTxؾ>C}A Bu>-RKp7lן?~OXK'`  ~ꛞGb{?FZ n@+56 ,s2'Iy7*.1kЀGO9k%TUu(|2a xZ%oxV&eMZ>{qz`_k.RԉwuR5rr9+V LjnL_T!||߇Ujm$o7QXm~ h5$o`bƹL-V҉Ƭp. 
fHۅ7=Z5jLMYuι#X@ ZpS\myIE0v+r]͔z+lzIy; ꎵ=H" 8Y[kj HI5)e4^D崏 +$n1:V,`Bbщ #k$՞C8Ԓ7"SN4xNmr3*oj#T޶+!ʠ +G ڨh:@BI -zQdO<-e3_jj<˞JIoO}eƜX2f`[ƞu/B0t{i܃M{N&U ou؃=؊5DrjұMMF6Ua›jVqe=T%0y{gk9{b p Hlf>W ECMs!ߥg'( Mv`Fij?lW' O'z{ Y{JUܲK,q]t`zȕVY55'k\hR8{[?'E}Y3'f-"n ?8i]ØF*~}0}ńD]OO>} !k@SߧCV+v+ecF<uZ@{7en]XB=c0,ôz (5ʠ`;j,7:Ur6B1Pc y&:ʦ >{7բ]һA|G/Xvu[ۻ{Om y&:ʦȬ>7GX\b//vOP҅J=Z)IL78niHlt8 Lv0فJֲVBd y[u3(a4G-M2X+`iwe~ܴ_81 MGȫ( v@s+2k6eVʦ)GBs:AH}xVīyn۲" <jֲrV*uJ[<.jf^)RF|]uΦ #>o#uR.ڪ^BNP\fOˏHuвhgM0,|Ps]w3Ƿåς&ϼcGƐQJi/!xF-F,n(XXB)S.5WFeɤ? 8~~w|)zkozveuӿRuծ>Z񏣂akq ɾcЌB M;~Y<*7C|z<2 k Fb< 7FTV7cڜ`m^VQWUBxFޗj-yݳQ/5<ӠHa2o0h\ybhX*@!`a씣( v2dp` +ocRV[d͐cjdTOZhfNJ͌yn#PH{lAEgl :Ժ5HN1H FT_qo({i,+ήQi,5PKшle.rȈ>\MvO {\㾰҉&;L4QN>c֏O*nnҗq&\z $8\(hE ^bs1KQ}xxkls1 o\CywGaل'dJ$m(+F\Y\nQ:P v2`+4f(OtnubBַlWaPuQ[sv+-Z"m3AG[uYh)B3+*BǴ3"  j3c-X;0$n1Y*s:Au ] _X% k>6|meV [$ JˬtHܲҲ^6 -Jzԭ{ #>]:|)FutmLeVʴRER.S79vP_ *;ePk`+ѭՠ!0(pHׂCkX[C~pbv cDi3>͘­-9'hYf뺸?<)pjXukJfE\H]go CD3sSN_b~Y 0pR˗oyT^]DeAH1Uu/*,yݐ.%j[n:#jG.r d[\gbYjE( PVJD?>-dn1Jc$,=[XQ4)(Z1J>"J`n .I6yϐ0fc !^Pr`IAUA"$ӔGkGT(Dj­5T="a3[a,EWB--eqRpi$-4j%;}ˬKSXA192.168.126.11:10357: read: connection reset by peer" start-of-body= Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.610879 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:41686->192.168.126.11:10357: read: connection reset by peer" Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.610986 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.611179 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:03 crc kubenswrapper[4781]: 
I0227 00:06:03.616952 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.617025 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.617104 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.617998 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.618271 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a" gracePeriod=30 Feb 27 00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.255911 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:04Z is after 2026-02-23T05:33:13Z Feb 27 00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.463894 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 00:06:04 crc 
kubenswrapper[4781]: I0227 00:06:04.464589 4781 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a" exitCode=255 Feb 27 00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.464667 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a"} Feb 27 00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.464706 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a"} Feb 27 00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.464828 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.466031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.466070 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.466088 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:05 crc kubenswrapper[4781]: E0227 00:06:05.146621 4781 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:05Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1897f1b368abff74 default 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.251122036 +0000 UTC m=+0.508661590,LastTimestamp:2026-02-27 00:05:31.251122036 +0000 UTC m=+0.508661590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:05 crc kubenswrapper[4781]: I0227 00:06:05.257308 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:05Z is after 2026-02-23T05:33:13Z Feb 27 00:06:05 crc kubenswrapper[4781]: E0227 00:06:05.550524 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:05Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 27 00:06:05 crc kubenswrapper[4781]: I0227 00:06:05.552807 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:05 crc kubenswrapper[4781]: I0227 00:06:05.554432 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:05 crc kubenswrapper[4781]: I0227 00:06:05.554513 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:05 crc kubenswrapper[4781]: I0227 00:06:05.554539 4781 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID"
Feb 27 00:06:05 crc kubenswrapper[4781]: I0227 00:06:05.554587 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 27 00:06:05 crc kubenswrapper[4781]: E0227 00:06:05.559314 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:05Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 27 00:06:06 crc kubenswrapper[4781]: I0227 00:06:06.258669 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:06Z is after 2026-02-23T05:33:13Z
Feb 27 00:06:06 crc kubenswrapper[4781]: I0227 00:06:06.759702 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 27 00:06:06 crc kubenswrapper[4781]: I0227 00:06:06.759913 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:06:06 crc kubenswrapper[4781]: I0227 00:06:06.761324 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:06:06 crc kubenswrapper[4781]: I0227 00:06:06.761528 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:06:06 crc kubenswrapper[4781]: I0227 00:06:06.761721 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:06:07 crc kubenswrapper[4781]: I0227 00:06:07.258620 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:07Z is after 2026-02-23T05:33:13Z
Feb 27 00:06:08 crc kubenswrapper[4781]: I0227 00:06:08.258603 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:08Z is after 2026-02-23T05:33:13Z
Feb 27 00:06:08 crc kubenswrapper[4781]: I0227 00:06:08.308670 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:06:08 crc kubenswrapper[4781]: I0227 00:06:08.310051 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:06:08 crc kubenswrapper[4781]: I0227 00:06:08.310190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:06:08 crc kubenswrapper[4781]: I0227 00:06:08.310315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:06:08 crc kubenswrapper[4781]: I0227 00:06:08.311180 4781 scope.go:117] "RemoveContainer" containerID="2c8810f20fd274cb743f419a437c4cab58c3ca4bf18ab25919217a2e2cb4c3b1"
Feb 27 00:06:09 crc kubenswrapper[4781]: I0227 00:06:09.257816 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:09Z is after 2026-02-23T05:33:13Z
Feb 27 00:06:09 crc kubenswrapper[4781]: I0227 00:06:09.480742 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 27 00:06:09 crc kubenswrapper[4781]: I0227 00:06:09.484050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9a88178cd0279fc812438f2f0bdbf17e596c3b8da7b7acc17b661c15e3e2f06f"}
Feb 27 00:06:09 crc kubenswrapper[4781]: I0227 00:06:09.484269 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:06:09 crc kubenswrapper[4781]: I0227 00:06:09.485319 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:06:09 crc kubenswrapper[4781]: I0227 00:06:09.485368 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:06:09 crc kubenswrapper[4781]: I0227 00:06:09.485387 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.257881 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:10Z is after 2026-02-23T05:33:13Z
Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.489823 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.490583 4781 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.493737 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9a88178cd0279fc812438f2f0bdbf17e596c3b8da7b7acc17b661c15e3e2f06f" exitCode=255
Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.493796 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9a88178cd0279fc812438f2f0bdbf17e596c3b8da7b7acc17b661c15e3e2f06f"}
Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.493845 4781 scope.go:117] "RemoveContainer" containerID="2c8810f20fd274cb743f419a437c4cab58c3ca4bf18ab25919217a2e2cb4c3b1"
Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.494074 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.496171 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.496228 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.496248 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.497256 4781 scope.go:117] "RemoveContainer" containerID="9a88178cd0279fc812438f2f0bdbf17e596c3b8da7b7acc17b661c15e3e2f06f"
Feb 27 00:06:10 crc kubenswrapper[4781]: E0227 00:06:10.497687 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.827588 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 27 00:06:10 crc kubenswrapper[4781]: E0227 00:06:10.833688 4781 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 27 00:06:10 crc kubenswrapper[4781]: E0227 00:06:10.835005 4781 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError"
Feb 27 00:06:11 crc kubenswrapper[4781]: I0227 00:06:11.258489 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:11Z is after 2026-02-23T05:33:13Z
Feb 27 00:06:11 crc kubenswrapper[4781]: E0227 00:06:11.386170 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 27 00:06:11 crc kubenswrapper[4781]: I0227 00:06:11.499755 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 27 00:06:12 crc kubenswrapper[4781]: W0227 00:06:12.215852 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:12Z is after 2026-02-23T05:33:13Z
Feb 27 00:06:12 crc kubenswrapper[4781]: E0227 00:06:12.216376 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 27 00:06:12 crc kubenswrapper[4781]: I0227 00:06:12.258572 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:12Z is after 2026-02-23T05:33:13Z
Feb 27 00:06:12 crc kubenswrapper[4781]: E0227 00:06:12.555801 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:12Z is after 2026-02-23T05:33:13Z" interval="7s"
Feb 27 00:06:12 crc kubenswrapper[4781]: I0227 00:06:12.559909 4781 kubelet_node_status.go:401] "Setting node annotation to enable
volume controller attach/detach"
Feb 27 00:06:12 crc kubenswrapper[4781]: I0227 00:06:12.561130 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:06:12 crc kubenswrapper[4781]: I0227 00:06:12.561173 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:06:12 crc kubenswrapper[4781]: I0227 00:06:12.561190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:06:12 crc kubenswrapper[4781]: I0227 00:06:12.561222 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 27 00:06:12 crc kubenswrapper[4781]: E0227 00:06:12.566292 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:12Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.257614 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:13Z is after 2026-02-23T05:33:13Z
Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.282155 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.282398 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.283419 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.283693 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.284124 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.284362 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.284588 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.284907 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.285002 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.285031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.286003 4781 scope.go:117] "RemoveContainer" containerID="9a88178cd0279fc812438f2f0bdbf17e596c3b8da7b7acc17b661c15e3e2f06f"
Feb 27 00:06:13 crc kubenswrapper[4781]: E0227 00:06:13.286426 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 27 00:06:13 crc kubenswrapper[4781]: W0227 00:06:13.764066 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:13Z is after 2026-02-23T05:33:13Z
Feb 27 00:06:13 crc kubenswrapper[4781]: E0227 00:06:13.765054 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:13Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 27 00:06:14 crc kubenswrapper[4781]: I0227 00:06:14.257469 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:14Z is after 2026-02-23T05:33:13Z
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.154047 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b368abff74 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.251122036 +0000 UTC m=+0.508661590,LastTimestamp:2026-02-27 00:05:31.251122036 +0000 UTC m=+0.508661590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27
00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.160726 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36babef97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,LastTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.167200 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac1e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,LastTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.174115 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac3cc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301469383 +0000 UTC m=+0.559008927,LastTimestamp:2026-02-27 00:05:31.301469383 +0000 UTC m=+0.559008927,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.180207 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b37067555d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.380839773 +0000 UTC m=+0.638379327,LastTimestamp:2026-02-27 00:05:31.380839773 +0000 UTC m=+0.638379327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.186726 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36babef97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36babef97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,LastTimestamp:2026-02-27 00:05:31.409728921 +0000 UTC m=+0.667268465,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.192786 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac1e09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac1e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,LastTimestamp:2026-02-27 00:05:31.409745722 +0000 UTC m=+0.667285276,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.199240 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac3cc7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac3cc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301469383 +0000 UTC m=+0.559008927,LastTimestamp:2026-02-27 00:05:31.409753442 +0000 UTC m=+0.667292996,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.205035 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36babef97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36babef97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,LastTimestamp:2026-02-27 00:05:31.410482951 +0000 UTC m=+0.668022505,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.212714 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac1e09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac1e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,LastTimestamp:2026-02-27 00:05:31.410500902 +0000 UTC m=+0.668040456,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.216729 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac3cc7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac3cc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301469383 +0000 UTC m=+0.559008927,LastTimestamp:2026-02-27 00:05:31.410510492 +0000 UTC m=+0.668050046,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.220958 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36babef97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36babef97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,LastTimestamp:2026-02-27 00:05:31.411119779 +0000 UTC m=+0.668659343,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.225162 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac1e09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac1e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,LastTimestamp:2026-02-27 00:05:31.411134939 +0000 UTC m=+0.668674513,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.230617 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac3cc7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac3cc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301469383 +0000 UTC m=+0.559008927,LastTimestamp:2026-02-27 00:05:31.411146569 +0000 UTC m=+0.668686133,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.233756 4781 event.go:359]
"Server rejected event (will not retry!)" err="events \"crc.1897f1b36babef97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36babef97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,LastTimestamp:2026-02-27 00:05:31.411623492 +0000 UTC m=+0.669163036,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.237817 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac1e09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac1e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,LastTimestamp:2026-02-27 00:05:31.411663813 +0000 UTC m=+0.669203367,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.241686 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac3cc7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac3cc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301469383 +0000 UTC m=+0.559008927,LastTimestamp:2026-02-27 00:05:31.411672434 +0000 UTC m=+0.669211988,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.245727 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36babef97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36babef97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,LastTimestamp:2026-02-27 00:05:31.411751676 +0000 UTC m=+0.669291230,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.250331 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac1e09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac1e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,LastTimestamp:2026-02-27 00:05:31.411768796 +0000 UTC m=+0.669308350,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.253820 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac3cc7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac3cc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301469383 +0000 UTC m=+0.559008927,LastTimestamp:2026-02-27 00:05:31.411777586 +0000 UTC m=+0.669317140,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: I0227 00:06:15.254473 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.259542 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36babef97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\"
in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36babef97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,LastTimestamp:2026-02-27 00:05:31.412463035 +0000 UTC m=+0.670002589,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.265818 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac1e09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac1e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,LastTimestamp:2026-02-27 00:05:31.412481435 +0000 UTC m=+0.670020989,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.269662 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac3cc7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac3cc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301469383 +0000 UTC m=+0.559008927,LastTimestamp:2026-02-27 00:05:31.412489926 +0000 UTC m=+0.670029480,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.273493 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36babef97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36babef97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,LastTimestamp:2026-02-27 00:05:31.412521297 +0000 UTC m=+0.670060851,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.280416 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac1e09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac1e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,LastTimestamp:2026-02-27 00:05:31.412530517 +0000 UTC m=+0.670070071,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.286454 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b388dd45a6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.791222182 +0000 UTC m=+1.048761756,LastTimestamp:2026-02-27 00:05:31.791222182 +0000 UTC m=+1.048761756,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.292952 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b388de8923 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] []
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.791304995 +0000 UTC m=+1.048844579,LastTimestamp:2026-02-27 00:05:31.791304995 +0000 UTC m=+1.048844579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.298596 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3890d2d3a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.794361658 +0000 UTC m=+1.051901262,LastTimestamp:2026-02-27 00:05:31.794361658 +0000 UTC m=+1.051901262,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.304617 4781 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b389501661 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.798746721 +0000 UTC m=+1.056286315,LastTimestamp:2026-02-27 00:05:31.798746721 +0000 UTC m=+1.056286315,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.309331 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f1b389e84946 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.808721222 +0000 UTC m=+1.066260776,LastTimestamp:2026-02-27 00:05:31.808721222 +0000 UTC m=+1.066260776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.313811 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3a9ceeb1f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.343929631 +0000 UTC m=+1.601469185,LastTimestamp:2026-02-27 00:05:32.343929631 +0000 UTC m=+1.601469185,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.320421 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3a9d04ec2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.344020674 +0000 UTC m=+1.601560228,LastTimestamp:2026-02-27 00:05:32.344020674 +0000 UTC m=+1.601560228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.324717 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3a9dac146 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.34470535 +0000 UTC m=+1.602244904,LastTimestamp:2026-02-27 00:05:32.34470535 +0000 UTC m=+1.602244904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.328671 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b3a9db2eed openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.344733421 +0000 UTC m=+1.602272975,LastTimestamp:2026-02-27 00:05:32.344733421 +0000 UTC m=+1.602272975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.332367 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f1b3a9dbe38f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.344779663 +0000 UTC m=+1.602319217,LastTimestamp:2026-02-27 00:05:32.344779663 +0000 UTC m=+1.602319217,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.337139 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3aa59d999 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.353034649 +0000 UTC m=+1.610574203,LastTimestamp:2026-02-27 00:05:32.353034649 +0000 UTC m=+1.610574203,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.338423 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3aa6fbe11 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.354469393 +0000 UTC m=+1.612008947,LastTimestamp:2026-02-27 00:05:32.354469393 +0000 UTC m=+1.612008947,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.343401 4781 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b3aa95f3cc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.356973516 +0000 UTC m=+1.614513060,LastTimestamp:2026-02-27 00:05:32.356973516 +0000 UTC m=+1.614513060,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.345522 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f1b3aab2dd3e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.358868286 +0000 UTC m=+1.616407840,LastTimestamp:2026-02-27 00:05:32.358868286 +0000 UTC m=+1.616407840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.350266 4781 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3aac10b24 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.35979754 +0000 UTC m=+1.617337094,LastTimestamp:2026-02-27 00:05:32.35979754 +0000 UTC m=+1.617337094,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.357425 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3aac2bba9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.359908265 +0000 UTC m=+1.617447819,LastTimestamp:2026-02-27 00:05:32.359908265 +0000 UTC m=+1.617447819,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc 
kubenswrapper[4781]: E0227 00:06:15.364305 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3ba40187b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.619782267 +0000 UTC m=+1.877321821,LastTimestamp:2026-02-27 00:05:32.619782267 +0000 UTC m=+1.877321821,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.372081 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3bae8e7cc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.630845388 +0000 UTC m=+1.888384942,LastTimestamp:2026-02-27 00:05:32.630845388 +0000 UTC 
m=+1.888384942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.381508 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3bb01792c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.632455468 +0000 UTC m=+1.889995032,LastTimestamp:2026-02-27 00:05:32.632455468 +0000 UTC m=+1.889995032,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.389017 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3c793112a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.84332369 +0000 UTC m=+2.100863264,LastTimestamp:2026-02-27 00:05:32.84332369 +0000 UTC m=+2.100863264,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.395974 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3c856d6ec openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.856153836 +0000 UTC m=+2.113693410,LastTimestamp:2026-02-27 00:05:32.856153836 +0000 UTC m=+2.113693410,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.402901 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3c86713e0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.857218016 +0000 UTC m=+2.114757580,LastTimestamp:2026-02-27 00:05:32.857218016 +0000 UTC m=+2.114757580,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.409867 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3d3d2fa7f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.048838783 +0000 UTC m=+2.306378337,LastTimestamp:2026-02-27 00:05:33.048838783 +0000 UTC 
m=+2.306378337,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.417248 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3d4d51145 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.065752901 +0000 UTC m=+2.323292495,LastTimestamp:2026-02-27 00:05:33.065752901 +0000 UTC m=+2.323292495,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.425854 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f1b3e44be5b0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.325198768 +0000 UTC m=+2.582738322,LastTimestamp:2026-02-27 00:05:33.325198768 +0000 UTC m=+2.582738322,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.432742 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3e467cf76 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.327028086 +0000 UTC m=+2.584567640,LastTimestamp:2026-02-27 00:05:33.327028086 +0000 UTC m=+2.584567640,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.441009 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3e4d9e564 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.334504804 +0000 UTC m=+2.592044378,LastTimestamp:2026-02-27 00:05:33.334504804 +0000 UTC m=+2.592044378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.447745 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b3e4e61f1d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.335306013 +0000 UTC m=+2.592845577,LastTimestamp:2026-02-27 00:05:33.335306013 +0000 UTC m=+2.592845577,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.455423 4781 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3f1452e4d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.542862413 +0000 UTC m=+2.800401967,LastTimestamp:2026-02-27 00:05:33.542862413 +0000 UTC m=+2.800401967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.461971 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b3f154b4aa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.54387985 +0000 UTC m=+2.801419404,LastTimestamp:2026-02-27 00:05:33.54387985 +0000 UTC m=+2.801419404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.466363 4781 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f1b3f15b1bca openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.544299466 +0000 UTC m=+2.801839020,LastTimestamp:2026-02-27 00:05:33.544299466 +0000 UTC m=+2.801839020,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.472162 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3f22d2e13 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.558066707 +0000 UTC m=+2.815606271,LastTimestamp:2026-02-27 00:05:33.558066707 +0000 UTC m=+2.815606271,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.478447 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3f27e2482 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.563372674 +0000 UTC m=+2.820912238,LastTimestamp:2026-02-27 00:05:33.563372674 +0000 UTC m=+2.820912238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.484660 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3f28a9169 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.564186985 +0000 UTC m=+2.821726539,LastTimestamp:2026-02-27 00:05:33.564186985 +0000 UTC m=+2.821726539,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.490825 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b3f2917949 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.564639561 +0000 UTC m=+2.822179125,LastTimestamp:2026-02-27 00:05:33.564639561 +0000 UTC m=+2.822179125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.497748 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f1b3f292160d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.564679693 +0000 UTC m=+2.822219247,LastTimestamp:2026-02-27 00:05:33.564679693 +0000 UTC m=+2.822219247,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.503147 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3f36fcb47 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.579209543 +0000 UTC m=+2.836749097,LastTimestamp:2026-02-27 00:05:33.579209543 +0000 UTC m=+2.836749097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.507535 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3f37c9b8e 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.580049294 +0000 UTC m=+2.837588848,LastTimestamp:2026-02-27 00:05:33.580049294 +0000 UTC m=+2.837588848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.512559 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3fd529034 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.745066036 +0000 UTC m=+3.002605590,LastTimestamp:2026-02-27 00:05:33.745066036 +0000 UTC m=+3.002605590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.517570 4781 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3fd5dca22 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.745801762 +0000 UTC m=+3.003341316,LastTimestamp:2026-02-27 00:05:33.745801762 +0000 UTC m=+3.003341316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.523415 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3fe07df6d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.756948333 +0000 UTC m=+3.014487897,LastTimestamp:2026-02-27 00:05:33.756948333 +0000 UTC m=+3.014487897,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc 
kubenswrapper[4781]: E0227 00:06:15.529781 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3fe25438f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.758874511 +0000 UTC m=+3.016414065,LastTimestamp:2026-02-27 00:05:33.758874511 +0000 UTC m=+3.016414065,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.535623 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3fe2b5d55 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.759274325 +0000 UTC 
m=+3.016813879,LastTimestamp:2026-02-27 00:05:33.759274325 +0000 UTC m=+3.016813879,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.542569 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3fe4638cf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.761034447 +0000 UTC m=+3.018574001,LastTimestamp:2026-02-27 00:05:33.761034447 +0000 UTC m=+3.018574001,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.548098 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b40ad10aee openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.971458798 +0000 UTC m=+3.228998352,LastTimestamp:2026-02-27 00:05:33.971458798 +0000 UTC m=+3.228998352,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.554578 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b40ae7b516 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.97294415 +0000 UTC m=+3.230483704,LastTimestamp:2026-02-27 00:05:33.97294415 +0000 UTC m=+3.230483704,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.560370 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b40bbcd71f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.986912031 +0000 UTC m=+3.244451595,LastTimestamp:2026-02-27 00:05:33.986912031 +0000 UTC m=+3.244451595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.567421 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b40c0662d0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.99173192 +0000 UTC m=+3.249271474,LastTimestamp:2026-02-27 00:05:33.99173192 +0000 UTC m=+3.249271474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.575081 4781 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b40c1bc877 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.993134199 +0000 UTC m=+3.250673753,LastTimestamp:2026-02-27 00:05:33.993134199 +0000 UTC m=+3.250673753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.585868 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b4151132f5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.143435509 +0000 UTC m=+3.400975103,LastTimestamp:2026-02-27 00:05:34.143435509 +0000 UTC m=+3.400975103,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.593135 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b415a3d83c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.153046076 +0000 UTC m=+3.410585630,LastTimestamp:2026-02-27 00:05:34.153046076 +0000 UTC m=+3.410585630,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.600517 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b415b065dd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.153868765 +0000 UTC m=+3.411408319,LastTimestamp:2026-02-27 00:05:34.153868765 +0000 UTC m=+3.411408319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.607267 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b420a61040 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.337740864 +0000 UTC m=+3.595280418,LastTimestamp:2026-02-27 00:05:34.337740864 +0000 UTC m=+3.595280418,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.615912 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b4212cd882 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.346573954 +0000 UTC m=+3.604113518,LastTimestamp:2026-02-27 00:05:34.346573954 +0000 UTC m=+3.604113518,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.621479 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b42140a529 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.347871529 +0000 UTC m=+3.605411083,LastTimestamp:2026-02-27 00:05:34.347871529 +0000 UTC m=+3.605411083,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.627058 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b42d170f33 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.546472755 +0000 UTC m=+3.804012309,LastTimestamp:2026-02-27 00:05:34.546472755 +0000 UTC m=+3.804012309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.631868 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b42df4c946 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.561003846 +0000 UTC m=+3.818543400,LastTimestamp:2026-02-27 00:05:34.561003846 +0000 UTC m=+3.818543400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.641665 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.1897f1b45dc59444 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:35.363216452 +0000 UTC m=+4.620756046,LastTimestamp:2026-02-27 00:05:35.363216452 +0000 UTC m=+4.620756046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.648565 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b46a41eeaa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:35.57269265 +0000 UTC m=+4.830232214,LastTimestamp:2026-02-27 00:05:35.57269265 +0000 UTC m=+4.830232214,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.659041 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b46ac45b7e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:35.58124019 +0000 UTC m=+4.838779754,LastTimestamp:2026-02-27 00:05:35.58124019 +0000 UTC m=+4.838779754,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.660727 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b46ad6d0a9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:35.582449833 +0000 UTC m=+4.839989397,LastTimestamp:2026-02-27 00:05:35.582449833 +0000 UTC m=+4.839989397,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.666824 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b47a3bb42a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:35.840719914 +0000 UTC m=+5.098259518,LastTimestamp:2026-02-27 00:05:35.840719914 +0000 UTC m=+5.098259518,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.675904 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b47b4aeb83 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:35.858494339 +0000 UTC m=+5.116033943,LastTimestamp:2026-02-27 00:05:35.858494339 +0000 UTC m=+5.116033943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.681370 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b47b63a5da openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:35.860114906 +0000 UTC m=+5.117654500,LastTimestamp:2026-02-27 00:05:35.860114906 +0000 UTC m=+5.117654500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.685504 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b489a0a2ec openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:36.098992876 +0000 UTC m=+5.356532430,LastTimestamp:2026-02-27 00:05:36.098992876 +0000 UTC m=+5.356532430,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.690636 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b48a571d6a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:36.110951786 +0000 UTC m=+5.368491340,LastTimestamp:2026-02-27 00:05:36.110951786 +0000 UTC m=+5.368491340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.697994 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b48a66dab2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:36.111983282 +0000 UTC m=+5.369522876,LastTimestamp:2026-02-27 00:05:36.111983282 +0000 UTC m=+5.369522876,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.703003 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b49767c3e2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:36.330146786 +0000 UTC m=+5.587686350,LastTimestamp:2026-02-27 00:05:36.330146786 +0000 UTC m=+5.587686350,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.707918 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b498649389 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:36.346715017 +0000 UTC m=+5.604254591,LastTimestamp:2026-02-27 00:05:36.346715017 +0000 UTC m=+5.604254591,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.712526 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b49870965f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:36.347502175 +0000 UTC m=+5.605041739,LastTimestamp:2026-02-27 00:05:36.347502175 +0000 UTC m=+5.605041739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.716559 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b4a488ac9a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:36.550407322 +0000 UTC m=+5.807946906,LastTimestamp:2026-02-27 00:05:36.550407322 +0000 UTC m=+5.807946906,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.722900 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b4a5479826 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:36.562919462 +0000 UTC m=+5.820459056,LastTimestamp:2026-02-27 00:05:36.562919462 +0000 UTC m=+5.820459056,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.730910 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Feb 27 00:06:15 crc kubenswrapper[4781]: &Event{ObjectMeta:{kube-apiserver-crc.1897f1b6a4b6e421 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Feb 27 00:06:15 crc kubenswrapper[4781]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 27 00:06:15 crc kubenswrapper[4781]:
Feb 27 00:06:15 crc kubenswrapper[4781]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:45.143370785 +0000 UTC m=+14.400910369,LastTimestamp:2026-02-27 00:05:45.143370785 +0000 UTC m=+14.400910369,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 27 00:06:15 crc kubenswrapper[4781]: >
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.737100 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b6a4b9c006 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:45.14355815 +0000 UTC m=+14.401097744,LastTimestamp:2026-02-27 00:05:45.14355815 +0000 UTC m=+14.401097744,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.741773 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897f1b6a4b6e421\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Feb 27 00:06:15 crc kubenswrapper[4781]: &Event{ObjectMeta:{kube-apiserver-crc.1897f1b6a4b6e421 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Feb 27 00:06:15 crc kubenswrapper[4781]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 27 00:06:15 crc kubenswrapper[4781]:
Feb 27 00:06:15 crc kubenswrapper[4781]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:45.143370785 +0000 UTC m=+14.400910369,LastTimestamp:2026-02-27 00:05:45.148316555 +0000 UTC m=+14.405856149,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 27 00:06:15 crc kubenswrapper[4781]: >
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.746552 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897f1b6a4b9c006\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b6a4b9c006 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:45.14355815 +0000 UTC m=+14.401097744,LastTimestamp:2026-02-27 00:05:45.14851044 +0000 UTC m=+14.406050034,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.752409 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 27 00:06:15 crc kubenswrapper[4781]: &Event{ObjectMeta:{kube-controller-manager-crc.1897f1b6e8b538a4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Feb 27 00:06:15 crc kubenswrapper[4781]: body:
Feb 27 00:06:15 crc kubenswrapper[4781]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:46.284112036 +0000 UTC m=+15.541651590,LastTimestamp:2026-02-27 00:05:46.284112036 +0000 UTC m=+15.541651590,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 27 00:06:15 crc kubenswrapper[4781]: >
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.757528 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b6e8b5caef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:46.284149487 +0000 UTC m=+15.541689041,LastTimestamp:2026-02-27 00:05:46.284149487 +0000 UTC m=+15.541689041,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.762152 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897f1b415b065dd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b415b065dd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.153868765 +0000 UTC m=+3.411408319,LastTimestamp:2026-02-27 00:05:46.4018509 +0000 UTC m=+15.659390494,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.765736 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897f1b420a61040\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b420a61040 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.337740864 +0000 UTC m=+3.595280418,LastTimestamp:2026-02-27 00:05:46.584296783 +0000 UTC m=+15.841836347,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.766857 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897f1b42140a529\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b42140a529 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.347871529 +0000 UTC m=+3.605411083,LastTimestamp:2026-02-27 00:05:46.596296309 +0000 UTC m=+15.853835883,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.774359 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f1b6e8b538a4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 27 00:06:15 crc kubenswrapper[4781]: &Event{ObjectMeta:{kube-controller-manager-crc.1897f1b6e8b538a4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Feb 27 00:06:15 crc kubenswrapper[4781]: body:
Feb 27 00:06:15 crc kubenswrapper[4781]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:46.284112036 +0000 UTC m=+15.541651590,LastTimestamp:2026-02-27 00:05:56.284743507 +0000 UTC m=+25.542283091,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 27 00:06:15 crc kubenswrapper[4781]: >
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.778735 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f1b6e8b5caef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b6e8b5caef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:46.284149487 +0000 UTC m=+15.541689041,LastTimestamp:2026-02-27 00:05:56.284801488 +0000 UTC m=+25.542341072,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.784255 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 27 00:06:15 crc kubenswrapper[4781]: &Event{ObjectMeta:{kube-controller-manager-crc.1897f1baf1765d16 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:41686->192.168.126.11:10357: read: connection reset by peer
Feb 27 00:06:15 crc kubenswrapper[4781]: body:
Feb 27 00:06:15 crc kubenswrapper[4781]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:06:03.610856726 +0000 UTC m=+32.868396320,LastTimestamp:2026-02-27 00:06:03.610856726 +0000 UTC m=+32.868396320,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 27 00:06:15 crc kubenswrapper[4781]: >
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.788588 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1baf177bec4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:41686->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:06:03.610947268 +0000 UTC m=+32.868486852,LastTimestamp:2026-02-27 00:06:03.610947268 +0000 UTC m=+32.868486852,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.794489 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1baf1e71776 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:06:03.61824447 +0000 UTC m=+32.875784064,LastTimestamp:2026-02-27 00:06:03.61824447 +0000 UTC m=+32.875784064,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.799278 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f1b3aa6fbe11\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3aa6fbe11 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.354469393 +0000 UTC m=+1.612008947,LastTimestamp:2026-02-27 00:06:03.631720944 +0000 UTC m=+32.889260538,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.803525 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f1b3ba40187b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3ba40187b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.619782267 +0000 UTC m=+1.877321821,LastTimestamp:2026-02-27 00:06:03.836852833 +0000 UTC m=+33.094392427,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.808508 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f1b3bae8e7cc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3bae8e7cc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.630845388 +0000 UTC m=+1.888384942,LastTimestamp:2026-02-27 00:06:03.847694718 +0000 UTC m=+33.105234272,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 00:06:16 crc kubenswrapper[4781]: I0227 00:06:16.260807 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 27 00:06:16 crc kubenswrapper[4781]: I0227 00:06:16.284261 4781 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 27 00:06:16 crc kubenswrapper[4781]: I0227 00:06:16.284485 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:06:16 crc kubenswrapper[4781]: E0227 00:06:16.290037 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f1b6e8b538a4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 27 00:06:16 crc kubenswrapper[4781]: &Event{ObjectMeta:{kube-controller-manager-crc.1897f1b6e8b538a4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Feb 27 00:06:16 crc kubenswrapper[4781]: body:
Feb 27 00:06:16 crc kubenswrapper[4781]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:46.284112036 +0000 UTC m=+15.541651590,LastTimestamp:2026-02-27 00:06:16.28443244 +0000 UTC m=+45.541972034,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 27 00:06:16 crc kubenswrapper[4781]: >
Feb 27 00:06:16 crc kubenswrapper[4781]: E0227 00:06:16.296964 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f1b6e8b5caef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b6e8b5caef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:46.284149487 +0000 UTC m=+15.541689041,LastTimestamp:2026-02-27 00:06:16.284534943
+0000 UTC m=+45.542074527,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:17 crc kubenswrapper[4781]: I0227 00:06:17.257972 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:18 crc kubenswrapper[4781]: I0227 00:06:18.259120 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:18 crc kubenswrapper[4781]: W0227 00:06:18.502034 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 27 00:06:18 crc kubenswrapper[4781]: E0227 00:06:18.502118 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 27 00:06:18 crc kubenswrapper[4781]: I0227 00:06:18.704983 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:06:18 crc kubenswrapper[4781]: I0227 00:06:18.705222 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:18 crc kubenswrapper[4781]: I0227 00:06:18.706670 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:18 crc kubenswrapper[4781]: I0227 00:06:18.706724 
4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:18 crc kubenswrapper[4781]: I0227 00:06:18.706743 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:18 crc kubenswrapper[4781]: I0227 00:06:18.707487 4781 scope.go:117] "RemoveContainer" containerID="9a88178cd0279fc812438f2f0bdbf17e596c3b8da7b7acc17b661c15e3e2f06f" Feb 27 00:06:18 crc kubenswrapper[4781]: E0227 00:06:18.707822 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:06:19 crc kubenswrapper[4781]: I0227 00:06:19.260025 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:19 crc kubenswrapper[4781]: E0227 00:06:19.565945 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 00:06:19 crc kubenswrapper[4781]: I0227 00:06:19.566966 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:19 crc kubenswrapper[4781]: I0227 00:06:19.568481 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:19 crc kubenswrapper[4781]: I0227 00:06:19.568529 4781 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:19 crc kubenswrapper[4781]: I0227 00:06:19.568551 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:19 crc kubenswrapper[4781]: I0227 00:06:19.568591 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:06:19 crc kubenswrapper[4781]: E0227 00:06:19.575698 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 00:06:20 crc kubenswrapper[4781]: I0227 00:06:20.260038 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:21 crc kubenswrapper[4781]: W0227 00:06:21.004232 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 27 00:06:21 crc kubenswrapper[4781]: E0227 00:06:21.004312 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 27 00:06:21 crc kubenswrapper[4781]: I0227 00:06:21.257932 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:21 crc kubenswrapper[4781]: E0227 00:06:21.386829 4781 eviction_manager.go:285] "Eviction manager: 
failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 00:06:22 crc kubenswrapper[4781]: I0227 00:06:22.257440 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:23 crc kubenswrapper[4781]: I0227 00:06:23.258534 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:24 crc kubenswrapper[4781]: I0227 00:06:24.258497 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.259517 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.274456 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.274888 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.276009 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.276040 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.276051 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.583079 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.583359 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.585540 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.585566 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.585574 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.587337 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.256556 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.542569 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.543344 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.543365 
4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.543374 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:26 crc kubenswrapper[4781]: E0227 00:06:26.570383 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.576558 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.577396 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.577417 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.577425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.577443 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:06:26 crc kubenswrapper[4781]: E0227 00:06:26.580447 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 00:06:27 crc kubenswrapper[4781]: I0227 00:06:27.259062 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Feb 27 00:06:28 crc kubenswrapper[4781]: I0227 00:06:28.257482 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:29 crc kubenswrapper[4781]: I0227 00:06:29.260492 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:30 crc kubenswrapper[4781]: I0227 00:06:30.291854 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:31 crc kubenswrapper[4781]: I0227 00:06:31.258512 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:31 crc kubenswrapper[4781]: E0227 00:06:31.387675 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 00:06:32 crc kubenswrapper[4781]: I0227 00:06:32.260885 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.258009 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" 
at the cluster scope Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.308890 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.310686 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.310748 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.310769 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.311596 4781 scope.go:117] "RemoveContainer" containerID="9a88178cd0279fc812438f2f0bdbf17e596c3b8da7b7acc17b661c15e3e2f06f" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.563425 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.565695 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5"} Feb 27 00:06:33 crc kubenswrapper[4781]: E0227 00:06:33.577158 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.581067 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.581886 
4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.581929 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.581943 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.581969 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:06:33 crc kubenswrapper[4781]: E0227 00:06:33.585392 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.267573 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.571175 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.571992 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.574207 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5" exitCode=255 Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.574260 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5"} Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.574294 4781 scope.go:117] "RemoveContainer" containerID="9a88178cd0279fc812438f2f0bdbf17e596c3b8da7b7acc17b661c15e3e2f06f" Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.574453 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.575610 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.575663 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.575673 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.576136 4781 scope.go:117] "RemoveContainer" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5" Feb 27 00:06:34 crc kubenswrapper[4781]: E0227 00:06:34.576319 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:06:35 crc kubenswrapper[4781]: I0227 00:06:35.257502 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Feb 27 00:06:35 crc kubenswrapper[4781]: I0227 00:06:35.579317 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 00:06:35 crc kubenswrapper[4781]: I0227 00:06:35.582475 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:35 crc kubenswrapper[4781]: I0227 00:06:35.583940 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:35 crc kubenswrapper[4781]: I0227 00:06:35.583980 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:35 crc kubenswrapper[4781]: I0227 00:06:35.583989 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:35 crc kubenswrapper[4781]: I0227 00:06:35.584499 4781 scope.go:117] "RemoveContainer" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5" Feb 27 00:06:35 crc kubenswrapper[4781]: E0227 00:06:35.584706 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:06:36 crc kubenswrapper[4781]: I0227 00:06:36.260530 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:37 crc kubenswrapper[4781]: I0227 00:06:37.258414 4781 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:38 crc kubenswrapper[4781]: I0227 00:06:38.257305 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:38 crc kubenswrapper[4781]: I0227 00:06:38.704901 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:06:38 crc kubenswrapper[4781]: I0227 00:06:38.706043 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:38 crc kubenswrapper[4781]: I0227 00:06:38.707797 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:38 crc kubenswrapper[4781]: I0227 00:06:38.707849 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:38 crc kubenswrapper[4781]: I0227 00:06:38.707867 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:38 crc kubenswrapper[4781]: I0227 00:06:38.708730 4781 scope.go:117] "RemoveContainer" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5" Feb 27 00:06:38 crc kubenswrapper[4781]: E0227 00:06:38.709010 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:06:39 crc kubenswrapper[4781]: I0227 00:06:39.261229 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:40 crc kubenswrapper[4781]: I0227 00:06:40.258096 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:40 crc kubenswrapper[4781]: E0227 00:06:40.578505 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 00:06:40 crc kubenswrapper[4781]: I0227 00:06:40.585733 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:40 crc kubenswrapper[4781]: I0227 00:06:40.586847 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:40 crc kubenswrapper[4781]: I0227 00:06:40.586894 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:40 crc kubenswrapper[4781]: I0227 00:06:40.586905 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:40 crc kubenswrapper[4781]: I0227 00:06:40.586924 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:06:40 crc kubenswrapper[4781]: E0227 00:06:40.590503 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create 
resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 00:06:41 crc kubenswrapper[4781]: I0227 00:06:41.260400 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:41 crc kubenswrapper[4781]: E0227 00:06:41.388582 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 00:06:42 crc kubenswrapper[4781]: I0227 00:06:42.260327 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:42 crc kubenswrapper[4781]: I0227 00:06:42.837024 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 00:06:42 crc kubenswrapper[4781]: I0227 00:06:42.849549 4781 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 27 00:06:43 crc kubenswrapper[4781]: I0227 00:06:43.260141 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:43 crc kubenswrapper[4781]: I0227 00:06:43.282490 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:06:43 crc kubenswrapper[4781]: I0227 00:06:43.282816 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:43 crc kubenswrapper[4781]: I0227 00:06:43.284127 4781 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:43 crc kubenswrapper[4781]: I0227 00:06:43.284183 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:43 crc kubenswrapper[4781]: I0227 00:06:43.284209 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:43 crc kubenswrapper[4781]: I0227 00:06:43.285009 4781 scope.go:117] "RemoveContainer" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5" Feb 27 00:06:43 crc kubenswrapper[4781]: E0227 00:06:43.285266 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:06:44 crc kubenswrapper[4781]: I0227 00:06:44.258370 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:45 crc kubenswrapper[4781]: I0227 00:06:45.259701 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:46 crc kubenswrapper[4781]: I0227 00:06:46.160920 4781 csr.go:261] certificate signing request csr-dbkk9 is approved, waiting to be issued Feb 27 00:06:46 crc kubenswrapper[4781]: I0227 00:06:46.169778 4781 csr.go:257] certificate signing request csr-dbkk9 is issued Feb 27 00:06:46 crc kubenswrapper[4781]: I0227 00:06:46.252314 4781 
reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.127183 4781 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.171179 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-07 02:32:55.977923198 +0000 UTC Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.171230 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6794h26m8.80669634s for next certificate rotation Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.590669 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.592552 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.592694 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.592715 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.592847 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.600996 4781 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.601080 4781 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.601094 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 
27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.606414 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.606472 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.606486 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.606504 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.606519 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:06:47Z","lastTransitionTime":"2026-02-27T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.618135 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.625815 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.625879 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.625899 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.625926 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.625945 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:06:47Z","lastTransitionTime":"2026-02-27T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.637211 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.643021 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.643223 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.643259 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.643289 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.643311 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:06:47Z","lastTransitionTime":"2026-02-27T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.653802 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.662599 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.662650 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.662662 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.662678 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.662689 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:06:47Z","lastTransitionTime":"2026-02-27T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.671602 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.671843 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.671868 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.772775 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.873908 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.974374 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.075461 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.175877 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.276688 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.376978 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.477243 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.577552 4781 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.677666 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.778388 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.879284 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.979516 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.079702 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.180758 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.281499 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.382711 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.483095 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.583506 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.684602 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:49 crc 
kubenswrapper[4781]: E0227 00:06:49.784922 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.886085 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.987090 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.088112 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.188870 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.289588 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.390543 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.490719 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.591523 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.692169 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.792767 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:50 crc kubenswrapper[4781]: I0227 00:06:50.872867 4781 reflector.go:368] Caches populated for *v1.RuntimeClass from 
k8s.io/client-go/informers/factory.go:160 Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.893037 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.993126 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.093320 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.193775 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.294696 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.388731 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.394807 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.495673 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.596991 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.697289 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.800326 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.901416 
4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.001964 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.102913 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.203452 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.303578 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.404271 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.505321 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.606392 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.707341 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.808402 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.908966 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.010059 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:53 crc 
kubenswrapper[4781]: E0227 00:06:53.111165 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.212064 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.313058 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.414176 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.514531 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.615707 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.715829 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.816942 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.918314 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.018506 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.119551 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.219989 4781 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.320673 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.421757 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.522711 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.622800 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.723930 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.824616 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.925706 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.026061 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.127017 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.228096 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:55 crc kubenswrapper[4781]: I0227 00:06:55.308452 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:55 crc kubenswrapper[4781]: I0227 00:06:55.309980 4781 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:55 crc kubenswrapper[4781]: I0227 00:06:55.310179 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:55 crc kubenswrapper[4781]: I0227 00:06:55.310389 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:55 crc kubenswrapper[4781]: I0227 00:06:55.311511 4781 scope.go:117] "RemoveContainer" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5" Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.312014 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.328290 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.428902 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.529266 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.629696 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.730567 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.831357 4781 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.931651 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.032812 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.133433 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.233874 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.334372 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:56 crc kubenswrapper[4781]: I0227 00:06:56.415240 4781 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.435184 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.536101 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.636286 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.736822 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.837586 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:56 crc 
kubenswrapper[4781]: E0227 00:06:56.937998 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.039032 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.139533 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.239915 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.340352 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.440744 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.541683 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.642506 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.742936 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.843813 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.944889 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.017927 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="error 
getting node \"crc\": node \"crc\" not found" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.023052 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.023285 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.023482 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.023708 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.023929 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:06:58Z","lastTransitionTime":"2026-02-27T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.038716 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.043408 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.043659 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.043811 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.043951 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.044078 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:06:58Z","lastTransitionTime":"2026-02-27T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.059808 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.064418 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.064484 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.064511 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.064541 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.064566 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:06:58Z","lastTransitionTime":"2026-02-27T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.081460 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.086613 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.086729 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.086756 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.086785 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.086808 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:06:58Z","lastTransitionTime":"2026-02-27T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.100193 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.100410 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.100447 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.201186 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.301703 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.403315 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.504002 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.604987 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.705162 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.806337 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.907712 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.009068 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.109879 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.210394 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:59 crc kubenswrapper[4781]: I0227 00:06:59.308770 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.312554 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:59 crc kubenswrapper[4781]: I0227 00:06:59.313763 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:06:59 crc kubenswrapper[4781]: I0227 00:06:59.313813 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:06:59 crc kubenswrapper[4781]: I0227 00:06:59.313824 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.413361 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.514347 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.615069 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.715946 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.817136 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.918093 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.019031 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.119595 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.219984 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.320763 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.421127 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.521221 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.622357 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.722772 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.823831 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.924962 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.025070 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.125717 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.225995 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.326923 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.389604 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.427576 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.527814 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.627966 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.728540 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.829706 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.930741 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.031100 4781 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.132093 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.233202 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.333538 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.433930 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.534864 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.635774 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.736253 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.836575 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.937132 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.038246 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.138347 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc 
kubenswrapper[4781]: E0227 00:07:03.239502 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.340041 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.441025 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.541158 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.642011 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.742113 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: I0227 00:07:03.750489 4781 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.842721 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.943805 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.044091 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.145264 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.245816 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.347040 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.448100 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.549014 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.649924 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.750299 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.851381 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.952342 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.052819 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.153297 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.253789 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.354670 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.454957 4781 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.555601 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.655916 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.756041 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.857124 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.957829 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.059009 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.159878 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.260394 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.361231 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.461786 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.561970 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.662672 4781 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.763514 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.864416 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.964679 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.065389 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.165689 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.266606 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.367580 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.467903 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.568123 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.669230 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.770307 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc 
kubenswrapper[4781]: E0227 00:07:07.871132 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.971960 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.072586 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.172883 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.273430 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.373839 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.447433 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.452252 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.452315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.452327 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.452343 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.452354 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:08Z","lastTransitionTime":"2026-02-27T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.463225 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.467362 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.467400 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.467416 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.467432 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.467443 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:08Z","lastTransitionTime":"2026-02-27T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.481857 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.486288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.486344 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.486368 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.486396 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.486417 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:08Z","lastTransitionTime":"2026-02-27T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.511733 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.511804 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.511818 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.511836 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.512168 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:08Z","lastTransitionTime":"2026-02-27T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.527291 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.527519 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.527574 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.627977 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.728821 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.829864 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.930388 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.030559 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.131257 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.231408 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.309197 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.310575 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.310665 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.310685 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.311658 4781 scope.go:117] "RemoveContainer" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.311940 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.332448 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.433131 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.534229 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.635314 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.735653 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 
00:07:09.836604 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.842778 4781 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.939149 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.939200 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.939264 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.939287 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.939308 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:09Z","lastTransitionTime":"2026-02-27T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.042884 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.043143 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.043250 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.043351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.043512 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.145816 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.145881 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.145898 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.145922 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.145938 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.249535 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.249597 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.249614 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.249663 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.249681 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.310830 4781 apiserver.go:52] "Watching apiserver" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.316135 4781 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.316511 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.317152 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.317311 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.317347 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.317523 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.317600 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.317677 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.320701 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.321431 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.321566 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.322432 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.322803 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.323280 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.323442 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.323763 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.323773 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.323908 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.324122 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.324780 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.351894 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:10 crc 
kubenswrapper[4781]: I0227 00:07:10.351939 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.351956 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.351979 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.351999 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.355028 4781 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.361918 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.376861 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.392071 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406245 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406303 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406340 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406372 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406404 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406434 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406469 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406499 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406529 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406561 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406592 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406622 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406682 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406714 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406744 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406746 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406776 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406880 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406922 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406909 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406958 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407105 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407200 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407194 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407283 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407317 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407358 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407402 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407447 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407489 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407528 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407559 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407590 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407661 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407733 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407776 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407813 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407848 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407882 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407916 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407948 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407982 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408012 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408047 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408081 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408113 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408179 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408218 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409281 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409333 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409369 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409450 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409482 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409514 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409546 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409700 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409739 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409771 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409805 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409836 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409871 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409904 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409938 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409970 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410006 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410080 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410116 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410150 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410185 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407238 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410218 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407586 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407797 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408046 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410257 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410292 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410329 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410359 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410391 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410425 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410457 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410488 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410524 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410562 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410615 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410705 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410754 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410806 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410849 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410885 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410919 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410954 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410991 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411023 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411054 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411086 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411118 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411190 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411226 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411292 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411325 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411358 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411394 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411425 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411460 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411498 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411531 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411564 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411598 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411723 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411775 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411960 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412014 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412058 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412108 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412154 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412203 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412256 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID:
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412302 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412350 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412397 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412450 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412488 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412523 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412558 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412590 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412625 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412710 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412748 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 00:07:10 
crc kubenswrapper[4781]: I0227 00:07:10.412783 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412817 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412856 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412889 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412920 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412955 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412996 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413140 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413176 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413207 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413240 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 
00:07:10.413275 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413312 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413345 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413379 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413413 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413445 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413478 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413546 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413582 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413618 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413709 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: 
I0227 00:07:10.413747 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413781 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413817 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413852 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413886 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413923 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413957 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413991 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414027 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414063 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414097 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 
00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414132 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414164 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414198 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414237 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414276 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414312 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414348 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414383 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414418 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414452 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414486 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 27 00:07:10 crc 
kubenswrapper[4781]: I0227 00:07:10.414521 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414556 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414595 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414677 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414729 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414779 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414825 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414874 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414915 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414952 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415005 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 
27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415041 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415076 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415113 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415146 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415182 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415218 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415264 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415300 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415336 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415373 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415410 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 00:07:10 crc 
kubenswrapper[4781]: I0227 00:07:10.415447 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415482 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415517 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415553 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415663 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415734 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415771 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415804 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420016 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420101 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420168 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420225 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420278 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420323 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420378 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420425 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420492 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408270 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408449 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.425925 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.426454 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.426611 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.426684 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.427015 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.427129 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.427300 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.427417 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.427464 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.427545 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.427696 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.428465 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.429111 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.422006 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.429850 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408495 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408990 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409188 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409409 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409850 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409854 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409874 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410194 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410674 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410878 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411074 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411414 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411465 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412670 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.437455 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412727 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413117 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413263 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413700 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414242 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414578 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415218 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415315 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.437709 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415445 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415442 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415732 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420675 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420696 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420735 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.420774 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420918 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.421061 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.421215 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.421311 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.421376 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.422072 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.422197 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.422323 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.423408 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.423597 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.423685 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.423795 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.423788 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.424620 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.424715 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.424941 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.424938 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.425202 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.425303 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.425756 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.425857 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.430189 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.430261 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.430275 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.430507 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.430943 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.430999 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.431137 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.431653 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.431797 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.431836 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.431518 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.431935 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.432250 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408869 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.432396 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.432408 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.433297 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.433301 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.433433 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.433464 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.433505 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.433634 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.434005 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.434514 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.434663 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.434903 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.435599 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.436124 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.436903 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.437087 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.437102 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.437199 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.438587 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.438758 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.438805 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.439345 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.439425 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.439328 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.440483 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.439598 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.440881 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.442517 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.443248 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.444508 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.441704 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.444797 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.446081 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.446339 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.446369 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.446376 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.446597 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.448280 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.446307 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.447230 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.447292 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.447583 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.447657 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.447700 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.448259 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.448774 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.448132 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.448602 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.448599 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.448889 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.449041 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.449042 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.449212 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.449429 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.449816 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.449821 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.450970 4781 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.451399 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.452061 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.452463 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:07:10.952413764 +0000 UTC m=+100.209953368 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.452542 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:10.952520277 +0000 UTC m=+100.210059871 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.452712 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.452877 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453027 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.453145 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:10.953101102 +0000 UTC m=+100.210640736 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453223 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453292 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453353 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453412 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453585 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453620 4781 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453693 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453726 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453755 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453782 4781 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453808 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453830 4781 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453850 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453868 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453886 4781 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453905 4781 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453924 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453943 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453961 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453979 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453998 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454015 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454034 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454051 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454070 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454089 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454107 4781 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454126 4781 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454145 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454163 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454180 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454197 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454215 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454233 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454251 4781 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454269 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454287 4781 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454307 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454745 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454998 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.457231 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.458776 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.458823 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.458847 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.458879 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.458902 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.462307 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.464542 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.468004 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.468901 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.469379 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.472385 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.472508 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.472595 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.472759 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:10.97272514 +0000 UTC m=+100.230264774 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.472546 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.473264 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454325 4781 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.473804 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.473892 4781 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.473975 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474050 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474132 4781 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474209 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474293 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474366 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474442 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474515 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474593 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474688 4781 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474779 4781 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474880 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474958 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475038 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475113 4781 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475218 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475309 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475389 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475469 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475590 4781 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475694 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475812 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475906 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475999 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476084 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476168 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476245 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476345 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476436 4781 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476511 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476582 4781 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476683 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476771 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476856 4781 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName:
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.477364 4781 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.477481 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.477571 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.477682 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.477784 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.477951 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.478149 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.478272 4781 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.478426 4781 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.478530 4781 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.478669 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.478775 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.478866 4781 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.478943 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" 
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479018 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479107 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479199 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479309 4781 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479425 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479504 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479613 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479742 4781 
reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479929 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.480027 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.480237 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.480348 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.480434 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.480517 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.480601 4781 reconciler_common.go:293] "Volume detached 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.480704 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.480905 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.481580 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.481740 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.481854 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.481960 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482069 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482158 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482245 4781 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482337 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482412 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482490 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482575 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482681 4781 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482773 4781 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482865 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482958 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483044 4781 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483124 4781 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483210 4781 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483295 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483383 4781 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483467 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483548 4781 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483696 4781 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.481903 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483808 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483971 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484012 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484033 4781 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484057 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484076 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484095 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484114 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484131 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484148 4781 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484166 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 27 
00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484183 4781 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484201 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484219 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484236 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484253 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484270 4781 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.473811 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.483411 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484288 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484427 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.484362 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.484492 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484161 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484223 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.484613 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:10.984574509 +0000 UTC m=+100.242114173 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.485869 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.486156 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.486337 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.486457 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.486484 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.486817 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.487097 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.486618 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.487251 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.487519 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.488214 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.488518 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.488583 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.488683 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.488750 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.490243 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.491575 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.492697 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.498610 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.499584 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.499700 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.499946 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.499944 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.500009 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.499591 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.501108 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.501304 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.501380 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.502030 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.502399 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.506744 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.506821 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.506947 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.506983 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.507212 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.507490 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.507855 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.508083 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.514406 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.525574 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.527834 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.528135 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.544938 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.562478 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.562513 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.562522 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.562538 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.562548 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.585475 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.585571 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.585605 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.585713 4781 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.585839 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.585956 4781 reconciler_common.go:293] "Volume detached for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.585997 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586023 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586052 4781 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586065 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586077 4781 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586090 4781 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586102 4781 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586113 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586125 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586136 4781 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586150 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586162 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586174 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586185 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586196 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586207 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586218 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586231 4781 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586244 4781 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586256 4781 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586268 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586279 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586291 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586303 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586315 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586327 4781 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586337 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586348 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") 
on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586359 4781 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586370 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586382 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586393 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586405 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586417 4781 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586429 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586440 4781 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586451 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586462 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586473 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586485 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586496 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586507 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586518 4781 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586529 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586539 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586550 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586561 4781 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586572 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.644789 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.657773 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.665167 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.665297 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.665387 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.665493 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.665572 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.669426 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.674173 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d60cfc6263e7a31b73237d590cab586ca1c5bb0bd1ff189f6a9548d2a24062bc"}
Feb 27 00:07:10 crc kubenswrapper[4781]: W0227 00:07:10.682495 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-b95b44680a9688a92cefef6124c9b7eec60a5565b90b37d2756147ffe3822e5a WatchSource:0}: Error finding container b95b44680a9688a92cefef6124c9b7eec60a5565b90b37d2756147ffe3822e5a: Status 404 returned error can't find the container with id b95b44680a9688a92cefef6124c9b7eec60a5565b90b37d2756147ffe3822e5a
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.768775 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.769070 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.769259 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.769507 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.769735 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.871647 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.871776 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.871848 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.871913 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.871972 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.974508 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.974560 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.974600 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.974619 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.974650 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.989981 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.990083 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.990125 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990188 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:07:11.990154026 +0000 UTC m=+101.247693620 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990261 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990281 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990320 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990333 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:11.99031269 +0000 UTC m=+101.247852324 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990340 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.990279 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990399 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990410 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:11.990387142 +0000 UTC m=+101.247926736 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.990547 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990602 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:11.990580737 +0000 UTC m=+101.248120291 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990622 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.991691 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.991740 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.991796 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:11.99178425 +0000 UTC m=+101.249323804 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.078357 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.078416 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.078427 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.078444 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.078457 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.180722 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.180786 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.180799 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.180816 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.180828 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.282865 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.282916 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.282932 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.282961 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.282976 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.313412 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.313917 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.315078 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.315667 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.316594 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.317068 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.317641 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.318719 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.319428 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.320366 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.321165 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.322205 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.322674 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.323185 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.324027 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.324521 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.325453 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.325889 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.326443 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.326682 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.327418 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.327986 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.328952 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.329349 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.330326 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.330726 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.331319 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.332325 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.332843 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.333887 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.334353 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.335146 4781 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.335243 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.336774 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.337670 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.338127 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.339678 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.340258 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.341132 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.341430 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.341778 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.342778 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.343241 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.344273 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.344944 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.345900 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.346336 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.347205 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.347802 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.348859 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.349340 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.350211 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir"
podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.350722 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.351692 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.352243 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.352717 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.361137 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.385831 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 
27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.385872 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.385884 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.385900 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.385911 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.410015 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.429856 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.440805 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.487754 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.487849 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.487862 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.487882 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.487894 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.589835 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.589879 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.589891 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.589906 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.589920 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.679778 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.679851 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.679872 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b95b44680a9688a92cefef6124c9b7eec60a5565b90b37d2756147ffe3822e5a"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.681885 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.683170 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"58a2dc55f42c0314911126d5f58434cc542b34d5467d8896fa32c78ba8af47e7"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.692093 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.692149 4781 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.692172 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.692200 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.692221 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.699928 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.718943 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.736543 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.756755 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.773830 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.792996 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.795564 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.795622 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 
00:07:11.795672 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.795697 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.795715 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.815296 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.834338 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.850649 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.863415 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.879197 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.891704 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.898342 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.898395 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.898412 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.898431 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.898443 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.997497 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.997562 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.997583 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.997602 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.997619 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997786 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:07:13.997768913 +0000 UTC m=+103.255308457 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997807 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997791 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997859 4781 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:13.997839165 +0000 UTC m=+103.255378719 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997859 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997911 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997954 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997970 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997977 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-27 00:07:13.997947168 +0000 UTC m=+103.255486752 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997870 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.998036 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.998035 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:13.99801208 +0000 UTC m=+103.255551634 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.998100 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:13.998091802 +0000 UTC m=+103.255631356 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.001315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.001376 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.001397 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.001440 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.001464 4781 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.103576 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.103691 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.103715 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.103756 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.103781 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.159644 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-d2xt9"] Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.160101 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-d2xt9" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.163080 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.163859 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.166910 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.191967 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.199588 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4365da31-2d17-4b58-bb27-bd47b5133a8c-hosts-file\") pod \"node-resolver-d2xt9\" (UID: \"4365da31-2d17-4b58-bb27-bd47b5133a8c\") " pod="openshift-dns/node-resolver-d2xt9" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.199850 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dhwr\" (UniqueName: \"kubernetes.io/projected/4365da31-2d17-4b58-bb27-bd47b5133a8c-kube-api-access-7dhwr\") pod \"node-resolver-d2xt9\" (UID: \"4365da31-2d17-4b58-bb27-bd47b5133a8c\") " pod="openshift-dns/node-resolver-d2xt9" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.206430 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc 
kubenswrapper[4781]: I0227 00:07:12.206490 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.206511 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.206539 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.206557 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.217528 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.235940 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.249472 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.262496 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.275302 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.286563 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.300916 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dhwr\" (UniqueName: \"kubernetes.io/projected/4365da31-2d17-4b58-bb27-bd47b5133a8c-kube-api-access-7dhwr\") pod \"node-resolver-d2xt9\" (UID: \"4365da31-2d17-4b58-bb27-bd47b5133a8c\") " pod="openshift-dns/node-resolver-d2xt9" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.300987 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/4365da31-2d17-4b58-bb27-bd47b5133a8c-hosts-file\") pod \"node-resolver-d2xt9\" (UID: \"4365da31-2d17-4b58-bb27-bd47b5133a8c\") " pod="openshift-dns/node-resolver-d2xt9" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.301088 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4365da31-2d17-4b58-bb27-bd47b5133a8c-hosts-file\") pod \"node-resolver-d2xt9\" (UID: \"4365da31-2d17-4b58-bb27-bd47b5133a8c\") " pod="openshift-dns/node-resolver-d2xt9" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.308349 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.308376 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.308401 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:12 crc kubenswrapper[4781]: E0227 00:07:12.308492 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:12 crc kubenswrapper[4781]: E0227 00:07:12.308588 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:12 crc kubenswrapper[4781]: E0227 00:07:12.308714 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.308996 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.309041 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.309059 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.309078 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.309098 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.333221 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dhwr\" (UniqueName: \"kubernetes.io/projected/4365da31-2d17-4b58-bb27-bd47b5133a8c-kube-api-access-7dhwr\") pod \"node-resolver-d2xt9\" (UID: \"4365da31-2d17-4b58-bb27-bd47b5133a8c\") " pod="openshift-dns/node-resolver-d2xt9" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.411284 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.411339 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.411351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.411370 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.411386 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.486124 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-d2xt9" Feb 27 00:07:12 crc kubenswrapper[4781]: W0227 00:07:12.502756 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4365da31_2d17_4b58_bb27_bd47b5133a8c.slice/crio-7c906e9e1f1d5d4ed385101cdf2c29ad382c8221dd02b3613e2b41466b7b10c8 WatchSource:0}: Error finding container 7c906e9e1f1d5d4ed385101cdf2c29ad382c8221dd02b3613e2b41466b7b10c8: Status 404 returned error can't find the container with id 7c906e9e1f1d5d4ed385101cdf2c29ad382c8221dd02b3613e2b41466b7b10c8 Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.512909 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.512961 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.512970 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.512989 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.513006 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.550402 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-2k4zf"] Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.551011 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-tlstj"] Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.551249 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.553929 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.558180 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-v6fnj"] Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.559869 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.559988 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.560103 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.560245 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.560276 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.560303 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.560433 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.562914 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.563333 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.563934 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.564197 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.564362 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.564466 4781 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.575870 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.587885 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.597540 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.608875 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49nxx\" (UniqueName: \"kubernetes.io/projected/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-kube-api-access-49nxx\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.608927 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/32c19e2e-0830-47a5-9ea8-862e1c9d8571-proxy-tls\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.608954 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-cni-bin\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.608998 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-cnibin\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609069 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-system-cni-dir\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609150 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-multus-certs\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609200 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609221 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/32c19e2e-0830-47a5-9ea8-862e1c9d8571-rootfs\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609301 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-system-cni-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609409 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-os-release\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609517 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-daemon-config\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609561 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609603 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj8st\" (UniqueName: \"kubernetes.io/projected/32c19e2e-0830-47a5-9ea8-862e1c9d8571-kube-api-access-qj8st\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609690 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cnibin\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609724 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cni-binary-copy\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609752 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-cni-binary-copy\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 
00:07:12.609781 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-netns\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609811 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x98vj\" (UniqueName: \"kubernetes.io/projected/2f348e07-ea87-45b6-8f2b-6e1b08eda780-kube-api-access-x98vj\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609842 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-hostroot\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609899 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-conf-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609933 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-socket-dir-parent\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 
00:07:12.609965 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-kubelet\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609993 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-etc-kubernetes\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.610026 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-cni-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.610058 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-k8s-cni-cncf-io\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.610084 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/32c19e2e-0830-47a5-9ea8-862e1c9d8571-mcd-auth-proxy-config\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc 
kubenswrapper[4781]: I0227 00:07:12.610128 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-cni-multus\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.610157 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-os-release\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.613950 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.615797 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.615842 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.615856 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.615874 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.615886 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.632362 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.648260 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.665563 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.676989 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.687779 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d2xt9" 
event={"ID":"4365da31-2d17-4b58-bb27-bd47b5133a8c","Type":"ContainerStarted","Data":"7c906e9e1f1d5d4ed385101cdf2c29ad382c8221dd02b3613e2b41466b7b10c8"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.688556 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.700063 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711056 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-cni-bin\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711108 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-cnibin\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711136 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-system-cni-dir\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: 
\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711158 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-multus-certs\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711179 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711201 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/32c19e2e-0830-47a5-9ea8-862e1c9d8571-rootfs\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711228 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-cni-bin\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711242 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-system-cni-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " 
pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711299 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-system-cni-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711330 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-os-release\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711347 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-cnibin\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711370 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-system-cni-dir\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711372 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-daemon-config\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711391 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-multus-certs\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711422 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711456 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj8st\" (UniqueName: \"kubernetes.io/projected/32c19e2e-0830-47a5-9ea8-862e1c9d8571-kube-api-access-qj8st\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711507 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cnibin\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711538 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cni-binary-copy\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711571 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-cni-binary-copy\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711598 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-netns\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711652 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x98vj\" (UniqueName: \"kubernetes.io/projected/2f348e07-ea87-45b6-8f2b-6e1b08eda780-kube-api-access-x98vj\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711681 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-hostroot\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711715 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-conf-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711745 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-socket-dir-parent\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711775 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-kubelet\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711803 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-etc-kubernetes\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711837 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-cni-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711866 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-k8s-cni-cncf-io\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711896 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/32c19e2e-0830-47a5-9ea8-862e1c9d8571-mcd-auth-proxy-config\") pod \"machine-config-daemon-v6fnj\" (UID: 
\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711947 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-cni-multus\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711976 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-os-release\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712006 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49nxx\" (UniqueName: \"kubernetes.io/projected/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-kube-api-access-49nxx\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712036 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/32c19e2e-0830-47a5-9ea8-862e1c9d8571-proxy-tls\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712049 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: 
\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712077 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/32c19e2e-0830-47a5-9ea8-862e1c9d8571-rootfs\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711461 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-os-release\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712552 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-daemon-config\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712669 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-k8s-cni-cncf-io\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712673 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-kubelet\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 
00:07:12.712737 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-socket-dir-parent\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712740 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-etc-kubernetes\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712776 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-netns\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712901 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-cni-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712965 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cni-binary-copy\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712993 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cnibin\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.713016 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-cni-multus\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.713065 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-hostroot\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.713118 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-conf-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.713154 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-os-release\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.713686 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/32c19e2e-0830-47a5-9ea8-862e1c9d8571-mcd-auth-proxy-config\") pod \"machine-config-daemon-v6fnj\" 
(UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.713856 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-cni-binary-copy\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.713877 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.716265 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/32c19e2e-0830-47a5-9ea8-862e1c9d8571-proxy-tls\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.724162 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.725393 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.725488 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.725509 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.725532 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.725550 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.735105 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x98vj\" (UniqueName: \"kubernetes.io/projected/2f348e07-ea87-45b6-8f2b-6e1b08eda780-kube-api-access-x98vj\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.737854 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49nxx\" (UniqueName: \"kubernetes.io/projected/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-kube-api-access-49nxx\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.740421 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.743832 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj8st\" (UniqueName: \"kubernetes.io/projected/32c19e2e-0830-47a5-9ea8-862e1c9d8571-kube-api-access-qj8st\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.759064 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with 
incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\
"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mount
Path\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.773648 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.789864 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.808937 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.820795 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.827998 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.828037 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.828048 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.828063 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.828072 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.829158 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.841931 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.874833 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: W0227 00:07:12.885957 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a6dd1e0_45ab_46f0_b298_d89e47aaeecb.slice/crio-afca0776f8332a8c4a92e2364a60b965769223e7fdd2984f8a337a5359abdfae WatchSource:0}: Error finding container afca0776f8332a8c4a92e2364a60b965769223e7fdd2984f8a337a5359abdfae: Status 404 returned error can't find the container with id afca0776f8332a8c4a92e2364a60b965769223e7fdd2984f8a337a5359abdfae Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.886772 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.894447 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: W0227 00:07:12.894816 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f348e07_ea87_45b6_8f2b_6e1b08eda780.slice/crio-126f52a357f307cfd5409436394d46cd4bf8f894039fb1caf98e2be20ed3ac05 WatchSource:0}: Error finding container 126f52a357f307cfd5409436394d46cd4bf8f894039fb1caf98e2be20ed3ac05: Status 404 returned error can't find the container with id 126f52a357f307cfd5409436394d46cd4bf8f894039fb1caf98e2be20ed3ac05 Feb 27 00:07:12 crc kubenswrapper[4781]: W0227 00:07:12.908140 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32c19e2e_0830_47a5_9ea8_862e1c9d8571.slice/crio-49b7bfb8be1ec4f939d5551164b703478eafd97057bba8eaacc08c4bdae4f0a5 WatchSource:0}: Error finding container 49b7bfb8be1ec4f939d5551164b703478eafd97057bba8eaacc08c4bdae4f0a5: Status 404 returned error can't find the container with id 49b7bfb8be1ec4f939d5551164b703478eafd97057bba8eaacc08c4bdae4f0a5 Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.927918 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d2zn6"] Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.928769 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.931780 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.932006 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.932127 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.932278 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.932417 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.932533 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.932637 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.933280 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.933305 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.933314 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.933327 4781 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.933522 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.950539 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.965008 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.978700 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.991646 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.004592 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014029 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-node-log\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014069 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-systemd\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014088 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-log-socket\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014103 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-bin\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014117 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-config\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014130 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-kubelet\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014143 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-slash\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014159 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-env-overrides\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014173 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014189 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014213 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-script-lib\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014229 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovn-node-metrics-cert\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014243 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5qlg\" (UniqueName: \"kubernetes.io/projected/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-kube-api-access-r5qlg\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014257 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-netd\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: 
I0227 00:07:13.014272 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-var-lib-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014284 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-ovn\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014300 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-systemd-units\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014314 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-ovn-kubernetes\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014328 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-netns\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 
00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014404 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-etc-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.017077 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.048956 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.052863 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.052907 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.052919 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.052938 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.052950 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.076869 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.090905 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.102269 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.112571 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115025 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovn-node-metrics-cert\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115050 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5qlg\" (UniqueName: \"kubernetes.io/projected/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-kube-api-access-r5qlg\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115068 4781 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-var-lib-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115083 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-ovn\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115100 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-netd\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115116 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-systemd-units\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115132 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-ovn-kubernetes\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115147 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-netns\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115181 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-etc-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115197 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-node-log\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115217 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-systemd\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115231 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-log-socket\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115256 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-bin\") pod \"ovnkube-node-d2zn6\" 
(UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115270 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-config\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115283 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-kubelet\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115296 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-slash\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115310 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-env-overrides\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115326 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d2zn6\" (UID: 
\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115347 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115361 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-script-lib\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115930 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-script-lib\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116439 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-node-log\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116546 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-systemd-units\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 
27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116597 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-ovn-kubernetes\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116648 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-netd\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116652 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-kubelet\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116562 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-netns\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116638 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-etc-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116673 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-var-lib-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116688 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-log-socket\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116692 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-bin\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116700 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116726 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-slash\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116742 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-systemd\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116748 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.117091 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-env-overrides\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.117096 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-config\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.117187 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-ovn\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.122189 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovn-node-metrics-cert\") pod \"ovnkube-node-d2zn6\" (UID: 
\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.129773 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5qlg\" (UniqueName: \"kubernetes.io/projected/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-kube-api-access-r5qlg\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.154783 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.154815 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.154824 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.154837 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.154846 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.245085 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.257354 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.257420 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.257434 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.257451 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.257496 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: W0227 00:07:13.257697 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12a87c22_b4e1_4aa9_8b3e_a34f7d159239.slice/crio-96af27195e73c8a72996dd4d8221316b5eec9c31c92a51b4fb0d127265c1c59f WatchSource:0}: Error finding container 96af27195e73c8a72996dd4d8221316b5eec9c31c92a51b4fb0d127265c1c59f: Status 404 returned error can't find the container with id 96af27195e73c8a72996dd4d8221316b5eec9c31c92a51b4fb0d127265c1c59f Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.360269 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.360304 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.360314 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.360327 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.360337 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.463008 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.463063 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.463076 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.463094 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.463116 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.569899 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.569966 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.569985 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.570011 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.570031 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.672496 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.672558 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.672577 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.672599 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.672617 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.693494 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlstj" event={"ID":"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb","Type":"ContainerStarted","Data":"3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.693545 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlstj" event={"ID":"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb","Type":"ContainerStarted","Data":"afca0776f8332a8c4a92e2364a60b965769223e7fdd2984f8a337a5359abdfae"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.694938 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4" exitCode=0 Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.695002 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.695032 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"96af27195e73c8a72996dd4d8221316b5eec9c31c92a51b4fb0d127265c1c59f"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.698179 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.698231 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.698247 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"49b7bfb8be1ec4f939d5551164b703478eafd97057bba8eaacc08c4bdae4f0a5"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.700402 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.702024 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d2xt9" event={"ID":"4365da31-2d17-4b58-bb27-bd47b5133a8c","Type":"ContainerStarted","Data":"a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.704677 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f348e07-ea87-45b6-8f2b-6e1b08eda780" containerID="e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8" exitCode=0 Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.704731 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerDied","Data":"e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.704761 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" 
event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerStarted","Data":"126f52a357f307cfd5409436394d46cd4bf8f894039fb1caf98e2be20ed3ac05"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.712211 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.734234 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.751185 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.777377 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.777431 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.777455 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.777474 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.777486 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.779770 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.795291 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.813538 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.829258 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.845208 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.872788 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.879883 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.879926 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.879936 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.879951 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.879962 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.928964 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.944952 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.963172 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.981328 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.982458 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.982506 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.982525 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.982552 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.982571 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.992297 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.020028 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.023107 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.023207 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023245 4781 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:07:18.023227666 +0000 UTC m=+107.280767220 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.023272 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023300 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.023313 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023367 4781 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:18.023346949 +0000 UTC m=+107.280886573 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.023396 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023425 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023425 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023439 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023445 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 
00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023448 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023454 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023475 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:18.023469222 +0000 UTC m=+107.281008766 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023486 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:18.023481213 +0000 UTC m=+107.281020767 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023513 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023557 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:18.023541464 +0000 UTC m=+107.281081128 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.033197 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"nam
e\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.049293 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.065363 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.084683 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc 
kubenswrapper[4781]: I0227 00:07:14.084710 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.084719 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.084730 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.084740 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.085384 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.095817 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.106018 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.118079 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.186793 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.186819 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.186826 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.186838 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.186848 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.289979 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.290030 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.290052 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.290078 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.290098 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.308257 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.308398 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.309107 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.309190 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.309292 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.309371 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.399858 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.400195 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.400212 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.400235 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.400251 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.502465 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.502525 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.502543 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.502569 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.502587 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.605901 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.605963 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.605982 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.606056 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.606079 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.713808 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.714099 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.714111 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.714125 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.714136 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.716457 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.716653 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.716756 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.716842 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.718993 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerStarted","Data":"3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.733249 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.747947 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.761910 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.778012 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.790973 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.810506 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.817659 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.817703 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.817716 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.817733 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.817746 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.821311 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.834597 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.843605 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.854357 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.868288 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.920195 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.920523 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.920656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.920751 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.920902 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.024389 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.024701 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.024836 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.025004 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.025147 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.128451 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.128932 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.129078 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.129221 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.129377 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.231973 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.232315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.232483 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.232663 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.232801 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.334937 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.334992 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.335012 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.335031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.335045 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.437288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.437508 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.437570 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.437649 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.437717 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.539704 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.539939 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.540111 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.540195 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.540284 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.642621 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.643854 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.644009 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.644201 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.644360 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.727469 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.727532 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.730473 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f348e07-ea87-45b6-8f2b-6e1b08eda780" containerID="3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca" exitCode=0 Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.730520 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerDied","Data":"3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.748288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.748332 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.748348 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.748371 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 
00:07:15.748387 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.763081 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.785966 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.801930 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.814402 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.840866 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.851330 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.851358 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.851366 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.851378 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.851387 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.860904 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.880231 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.895105 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.912952 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.924788 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.945415 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.954175 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.954219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.954236 4781 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.954257 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.954274 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.056729 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.056790 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.056809 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.056832 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.056849 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.159229 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.159292 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.159310 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.159336 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.159354 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.263345 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.263385 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.263400 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.263421 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.263436 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.309003 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.309056 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.309188 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:16 crc kubenswrapper[4781]: E0227 00:07:16.309160 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:16 crc kubenswrapper[4781]: E0227 00:07:16.309340 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:16 crc kubenswrapper[4781]: E0227 00:07:16.309472 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.366302 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.366340 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.366355 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.366372 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.366384 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.468594 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.468649 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.468658 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.468675 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.468685 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.571485 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.571529 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.571542 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.571585 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.571600 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.674500 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.674528 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.674539 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.674554 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.674565 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.743935 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerDied","Data":"6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.744074 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f348e07-ea87-45b6-8f2b-6e1b08eda780" containerID="6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d" exitCode=0 Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.762245 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.778791 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.778843 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.778857 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.778875 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.778887 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.779763 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.808102 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.826673 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.842156 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.858886 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.875612 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.880970 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc 
kubenswrapper[4781]: I0227 00:07:16.880995 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.881003 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.881016 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.881026 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.897174 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.918374 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.933313 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.948318 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.986883 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.986920 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.986929 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc 
kubenswrapper[4781]: I0227 00:07:16.986943 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.986951 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.089163 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.089194 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.089205 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.089221 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.089233 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.191375 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.191427 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.191439 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.191455 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.191467 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.294066 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.294091 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.294099 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.294112 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.294120 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.397522 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.397579 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.397600 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.397676 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.397701 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.500205 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.500264 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.500280 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.500301 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.500316 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.603285 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.603345 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.603419 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.603450 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.603470 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.706301 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.706409 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.706435 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.706466 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.706487 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.759038 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.762841 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f348e07-ea87-45b6-8f2b-6e1b08eda780" containerID="6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e" exitCode=0 Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.762899 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerDied","Data":"6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.783027 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.797601 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.809217 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.809257 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.809269 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.809286 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.809297 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.810428 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.830487 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.845772 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.859260 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.886971 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.903164 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.912816 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.912843 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.912852 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc 
kubenswrapper[4781]: I0227 00:07:17.912866 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.912876 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.916076 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 
00:07:17.937440 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 
00:07:17.960609 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.014351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.014398 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.014411 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.014429 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.014441 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.062743 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.062824 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.062846 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.062867 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.062943 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-02-27 00:07:26.062908657 +0000 UTC m=+115.320448221 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.062956 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.062985 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.062999 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063021 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063038 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 
00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063045 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063071 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:26.063049331 +0000 UTC m=+115.320588905 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063085 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063133 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063145 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063094 4781 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:26.063083501 +0000 UTC m=+115.320623075 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063214 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:26.063194994 +0000 UTC m=+115.320734548 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063237 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:26.063231015 +0000 UTC m=+115.320770559 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.117100 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.117148 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.117163 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.117181 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.117191 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.219859 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.219913 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.219927 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.219944 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.219958 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.309161 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.309224 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.309355 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.309293 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.309478 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.309620 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.322240 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.322296 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.322313 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.322336 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.322355 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.425576 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.425643 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.425657 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.425672 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.425683 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.528920 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.528968 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.528981 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.528999 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.529011 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.633554 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.633806 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.633822 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.633843 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.633860 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.736265 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.736339 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.736358 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.736381 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.736398 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.769210 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f348e07-ea87-45b6-8f2b-6e1b08eda780" containerID="08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd" exitCode=0 Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.769250 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerDied","Data":"08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.793846 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.812849 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.833030 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.839044 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.839091 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.839109 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc 
kubenswrapper[4781]: I0227 00:07:18.839133 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.839151 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.854853 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.871193 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.894290 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.911471 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.911544 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.911567 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.911596 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.911619 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.929085 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.932071 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.936582 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.936619 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.936646 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.936663 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.936675 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.946615 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.948134 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.951884 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.951938 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.951958 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.951986 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.952006 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.960091 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.971756 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.974285 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.975686 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.975779 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.975824 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.975843 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.975856 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.992530 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.995425 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.996698 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.996748 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.996761 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.996779 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.996795 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: E0227 00:07:19.012828 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: E0227 00:07:19.012980 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.014563 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.014590 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.014604 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.014641 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.014656 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.117365 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.117434 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.117460 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.117494 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.117520 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.223030 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.223325 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.223340 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.223357 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.223371 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.233096 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-rc856"] Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.233493 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.240938 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.243176 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.244150 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.244159 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.263437 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.283278 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.301935 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.321686 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.327234 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.327295 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.327318 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.327347 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.327433 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.335062 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.354821 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.376171 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq5gj\" (UniqueName: \"kubernetes.io/projected/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-kube-api-access-rq5gj\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.376226 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-serviceca\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.376284 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-host\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.376671 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.390723 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.402029 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.421728 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.431553 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.431592 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.431603 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.431651 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.431665 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.435728 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.447858 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.477230 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq5gj\" (UniqueName: \"kubernetes.io/projected/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-kube-api-access-rq5gj\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.477270 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-serviceca\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.477309 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-host\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.477422 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-host\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.481985 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-serviceca\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.496872 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rq5gj\" (UniqueName: \"kubernetes.io/projected/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-kube-api-access-rq5gj\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.533617 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.533694 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.533710 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.533731 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.533747 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.572077 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: W0227 00:07:19.591420 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda60df0eb_b7b5_4b83_8d09_43fcd7c63ab2.slice/crio-254ee4af5136084e6fbb4d938d96a18bf59daaddcc4f6e83208848cf5ed556ff WatchSource:0}: Error finding container 254ee4af5136084e6fbb4d938d96a18bf59daaddcc4f6e83208848cf5ed556ff: Status 404 returned error can't find the container with id 254ee4af5136084e6fbb4d938d96a18bf59daaddcc4f6e83208848cf5ed556ff Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.636790 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.636838 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.636856 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.636880 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.636899 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.739516 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.739573 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.739597 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.739705 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.739734 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.773151 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rc856" event={"ID":"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2","Type":"ContainerStarted","Data":"254ee4af5136084e6fbb4d938d96a18bf59daaddcc4f6e83208848cf5ed556ff"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.782362 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.782412 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.782428 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.782441 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.794470 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f348e07-ea87-45b6-8f2b-6e1b08eda780" containerID="74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245" exitCode=0 Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.794504 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerDied","Data":"74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.800840 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.815618 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.833971 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.841397 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc 
kubenswrapper[4781]: I0227 00:07:19.841441 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.841456 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.841477 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.841494 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.842880 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.843392 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.849083 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.861114 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.875171 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.889787 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.907698 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.920082 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.940499 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.944780 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.944832 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.944845 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.944864 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.944880 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.953389 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.967872 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.982480 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.995850 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.007992 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.033933 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.046761 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:1
0Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.047333 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 
27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.047351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.047358 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.047370 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.047378 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.056702 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.071712 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.087815 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.100332 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.113275 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.126742 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.143200 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.149709 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.149738 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.149750 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.149765 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.149776 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.252219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.252267 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.252278 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.252295 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.252308 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.308902 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.308961 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.308977 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:20 crc kubenswrapper[4781]: E0227 00:07:20.309058 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:20 crc kubenswrapper[4781]: E0227 00:07:20.309166 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:20 crc kubenswrapper[4781]: E0227 00:07:20.309463 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.325446 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.326145 4781 scope.go:117] "RemoveContainer" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.359818 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.359876 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.359897 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.359923 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.359943 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.462592 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.462659 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.462673 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.462690 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.462704 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.565330 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.565367 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.565377 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.565391 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.565400 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.668039 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.668092 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.668112 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.668135 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.668151 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.770603 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.770664 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.770677 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.770693 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.770706 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.799243 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rc856" event={"ID":"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2","Type":"ContainerStarted","Data":"e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.801745 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.803957 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.804555 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.816273 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerStarted","Data":"606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.827057 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.844005 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.855558 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.872749 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.874060 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.874212 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.874350 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.874474 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.874589 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.885111 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.897126 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.917512 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240
bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.929665 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.952228 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.971933 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.979101 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.979191 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.979217 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.979281 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.979308 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.987112 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.001458 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.013691 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.025321 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.040432 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.057834 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.079299 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-c
ni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.088585 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.088670 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.088689 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.088709 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.088724 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.096968 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.112482 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.123251 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.134271 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.145505 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.165885 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.179242 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.192455 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.192529 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.192547 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.192567 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.192581 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.201443 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.213914 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.295024 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.295082 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.295092 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.295107 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.295148 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.330304 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.335040 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.354089 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.367765 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.399300 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.399358 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.399369 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.399383 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.399394 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.432039 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.463074 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:1
0Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.474988 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.488862 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.505445 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.505836 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.505927 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.506013 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.506126 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.507798 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z 
is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.521089 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.532193 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.541050 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.550160 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992
806ae032decbc8c1382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.559765 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.609073 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.609289 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.609348 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.609404 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.609496 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.712192 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.712224 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.712233 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.712247 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.712257 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.814932 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.814974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.814986 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.815003 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.815014 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.918265 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.918310 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.918322 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.918340 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.918352 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.020350 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.020409 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.020421 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.020436 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.020450 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.123239 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.123284 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.123299 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.123315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.123327 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.226049 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.226137 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.226161 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.226190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.226215 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.309215 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.309287 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.309346 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:22 crc kubenswrapper[4781]: E0227 00:07:22.309405 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:22 crc kubenswrapper[4781]: E0227 00:07:22.309515 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:22 crc kubenswrapper[4781]: E0227 00:07:22.309691 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.328614 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.328668 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.328679 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.328696 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.328708 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.431768 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.431839 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.431861 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.431888 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.431907 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.533922 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.533982 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.534005 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.534036 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.534058 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.636976 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.637043 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.637061 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.637086 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.637104 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.740219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.740292 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.740317 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.740348 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.740370 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.824908 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/0.log" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.829335 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78" exitCode=1 Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.829399 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.830508 4781 scope.go:117] "RemoveContainer" containerID="cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.843787 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.843843 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.843862 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.843887 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.843906 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.862002 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.886604 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.906998 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.927868 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.949535 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.951514 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.951535 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.951544 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.951558 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.951567 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.968378 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.003507 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:22Z\\\",\\\"message\\\":\\\"9 6566 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.358987 6566 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359235 6566 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359580 6566 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 00:07:22.359758 6566 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:22.360687 6566 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:22.360715 6566 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:22.360738 6566 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:22.360814 6566 factory.go:656] Stopping watch factory\\\\nI0227 00:07:22.360840 6566 ovnkube.go:599] Stopped ovnkube\\\\nI0227 00:07:22.360849 6566 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:22.360873 6566 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35
b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.036405 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.054680 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.054733 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.054751 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:23 crc 
kubenswrapper[4781]: I0227 00:07:23.054775 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.054792 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:23Z","lastTransitionTime":"2026-02-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.054904 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 
00:07:23.086624 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.101957 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:23Z is after 2025-08-24T17:21:41Z"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.120467 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.140702 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-c
ni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.154677 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.157377 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.157452 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.157471 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.157497 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.157515 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:23Z","lastTransitionTime":"2026-02-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.259723 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.259783 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.259803 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.259827 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.259843 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:23Z","lastTransitionTime":"2026-02-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.362751 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.362837 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.362860 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.362892 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.362921 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:23Z","lastTransitionTime":"2026-02-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.465196 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.465269 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.465288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.465315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.465332 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:23Z","lastTransitionTime":"2026-02-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.567277 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.567319 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.567332 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.567350 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.567364 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:23Z","lastTransitionTime":"2026-02-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.669783 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.669826 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.669837 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.669854 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.669864 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:23Z","lastTransitionTime":"2026-02-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.772503 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.772539 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.772548 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.772566 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.772575 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:23Z","lastTransitionTime":"2026-02-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.285424 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.285479 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.285496 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.285519 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.285538 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:24Z","lastTransitionTime":"2026-02-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.289263 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/0.log" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.293994 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195"} Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.294518 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.308570 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.308587 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:24 crc kubenswrapper[4781]: E0227 00:07:24.308869 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.308595 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:24 crc kubenswrapper[4781]: E0227 00:07:24.309052 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:24 crc kubenswrapper[4781]: E0227 00:07:24.309261 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.315523 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.333473 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.350255 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.369550 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.385198 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.391579 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.391651 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.391666 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.391686 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.391703 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:24Z","lastTransitionTime":"2026-02-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.408431 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.435359 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:22Z\\\",\\\"message\\\":\\\"9 6566 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.358987 6566 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359235 6566 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359580 6566 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 00:07:22.359758 6566 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:22.360687 6566 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:22.360715 6566 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:22.360738 6566 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:22.360814 6566 factory.go:656] Stopping watch factory\\\\nI0227 00:07:22.360840 6566 ovnkube.go:599] Stopped ovnkube\\\\nI0227 00:07:22.360849 6566 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:22.360873 6566 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.449802 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.465429 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.488777 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.494214 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.494251 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.494262 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.494278 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.494289 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:24Z","lastTransitionTime":"2026-02-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.505835 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.524776 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.543949 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-c
ni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.559418 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.597476 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.597841 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.598027 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.598219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.598402 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:24Z","lastTransitionTime":"2026-02-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.702059 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.702147 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.702176 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.702208 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.702225 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:24Z","lastTransitionTime":"2026-02-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.806016 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.806070 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.806088 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.806114 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.806131 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:24Z","lastTransitionTime":"2026-02-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.909822 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.909885 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.909902 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.909964 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.909985 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:24Z","lastTransitionTime":"2026-02-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.013282 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.013369 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.013393 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.013421 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.013440 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.116190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.116253 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.116271 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.116296 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.116314 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.219289 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.219378 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.219402 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.219434 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.219462 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.300364 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/1.log" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.301533 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/0.log" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.306891 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195" exitCode=1 Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.306973 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.307052 4781 scope.go:117] "RemoveContainer" containerID="cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.308086 4781 scope.go:117] "RemoveContainer" containerID="5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195" Feb 27 00:07:25 crc kubenswrapper[4781]: E0227 00:07:25.308390 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.322434 4781 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.322474 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.322483 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.322496 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.322505 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.335098 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.346939 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s"] Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.347668 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.350460 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.350462 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.361136 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.381250 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.399349 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.416735 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.425427 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.425474 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.425510 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.425552 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.425573 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.433367 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.454597 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/929a21d9-47cd-44cc-b211-258202a86076-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.454661 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v57b4\" (UniqueName: \"kubernetes.io/projected/929a21d9-47cd-44cc-b211-258202a86076-kube-api-access-v57b4\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.454715 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/929a21d9-47cd-44cc-b211-258202a86076-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.454762 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/929a21d9-47cd-44cc-b211-258202a86076-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.465709 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:22Z\\\",\\\"message\\\":\\\"9 6566 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.358987 6566 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359235 6566 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359580 6566 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 00:07:22.359758 6566 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:22.360687 6566 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:22.360715 6566 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:22.360738 6566 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:22.360814 6566 factory.go:656] Stopping watch factory\\\\nI0227 00:07:22.360840 6566 ovnkube.go:599] Stopped ovnkube\\\\nI0227 00:07:22.360849 6566 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:22.360873 6566 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:24Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 3 for removal\\\\nI0227 00:07:24.408545 6749 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 00:07:24.408563 6749 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 00:07:24.408576 6749 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:24.408581 6749 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:24.408591 6749 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 00:07:24.408618 
6749 factory.go:656] Stopping watch factory\\\\nI0227 00:07:24.408501 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0227 00:07:24.408673 6749 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:24.408684 6749 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 00:07:24.408611 6749 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 00:07:24.408708 6749 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 00:07:24.408782 6749 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:24.408880 6749 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:24.408980 6749 ovnkube.go:599] Stopped ovnkube\\\\nI0227 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mount
Path\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.485426 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.501966 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.528531 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.528755 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.528839 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.528955 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.529041 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.534460 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.555664 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/929a21d9-47cd-44cc-b211-258202a86076-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.555883 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v57b4\" (UniqueName: \"kubernetes.io/projected/929a21d9-47cd-44cc-b211-258202a86076-kube-api-access-v57b4\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.556054 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/929a21d9-47cd-44cc-b211-258202a86076-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.556849 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/929a21d9-47cd-44cc-b211-258202a86076-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.556773 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/929a21d9-47cd-44cc-b211-258202a86076-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.557052 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/929a21d9-47cd-44cc-b211-258202a86076-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.558424 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 
00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.574578 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/929a21d9-47cd-44cc-b211-258202a86076-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.588444 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v57b4\" (UniqueName: \"kubernetes.io/projected/929a21d9-47cd-44cc-b211-258202a86076-kube-api-access-v57b4\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 
00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.588735 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 
00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.613908 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7
e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.629713 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.630908 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.630967 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.630990 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.631019 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.631041 4781 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.645950 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.670487 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.672319 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\
\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: W0227 00:07:25.690885 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod929a21d9_47cd_44cc_b211_258202a86076.slice/crio-63eac76b570756aaa554c96657c8ea62bf9ba1f65af88dfe4cac9c28439e8107 WatchSource:0}: Error finding container 63eac76b570756aaa554c96657c8ea62bf9ba1f65af88dfe4cac9c28439e8107: Status 404 returned error can't find the container with id 63eac76b570756aaa554c96657c8ea62bf9ba1f65af88dfe4cac9c28439e8107 Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.694871 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.718143 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.734162 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.734214 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.734232 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.734256 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.734273 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.740975 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.758545 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.781972 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:22Z\\\",\\\"message\\\":\\\"9 6566 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.358987 6566 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359235 6566 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359580 6566 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 00:07:22.359758 6566 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:22.360687 6566 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:22.360715 6566 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:22.360738 6566 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:22.360814 6566 factory.go:656] Stopping watch factory\\\\nI0227 00:07:22.360840 6566 ovnkube.go:599] Stopped ovnkube\\\\nI0227 00:07:22.360849 6566 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:22.360873 6566 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:24Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 3 for removal\\\\nI0227 00:07:24.408545 6749 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 00:07:24.408563 6749 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 00:07:24.408576 6749 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:24.408581 6749 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:24.408591 6749 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 00:07:24.408618 
6749 factory.go:656] Stopping watch factory\\\\nI0227 00:07:24.408501 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0227 00:07:24.408673 6749 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:24.408684 6749 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 00:07:24.408611 6749 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 00:07:24.408708 6749 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 00:07:24.408782 6749 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:24.408880 6749 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:24.408980 6749 ovnkube.go:599] Stopped ovnkube\\\\nI0227 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mount
Path\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.801654 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.820235 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.837434 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.837503 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.837529 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.837561 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.837589 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.841929 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:
07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.866136 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additiona
l-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8
c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/op
t/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.882599 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.903100 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.936375 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a4180
5072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.941863 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.941921 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.941938 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.941964 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.941982 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.960151 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.044946 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.045013 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.045034 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.045064 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.045085 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.063174 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.063415 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063449 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:07:42.063405548 +0000 UTC m=+131.320945142 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.063511 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.063582 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063655 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063693 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063708 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063765 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:42.063745397 +0000 UTC m=+131.321284961 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063767 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.063670 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063837 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063854 4781 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:42.063832799 +0000 UTC m=+131.321372453 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063854 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.064017 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:42.063981213 +0000 UTC m=+131.321520817 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.064053 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.064082 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.064152 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:42.064128127 +0000 UTC m=+131.321667801 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.100448 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-kpnjj"] Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.100890 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.100945 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.122469 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\
\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wherea
bouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.137658 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.147118 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.147171 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.147189 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.147212 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.147229 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.153853 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.186152 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a4180
5072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.200752 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.218951 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.240858 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.249430 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.249733 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.249817 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.249906 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.249976 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.263690 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.266091 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.266177 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db9s2\" (UniqueName: \"kubernetes.io/projected/e866e388-01ab-407a-a59b-d0ba6c3f6f22-kube-api-access-db9s2\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.281260 4781 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.294008 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.305010 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.308235 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.308251 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.308485 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.308603 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.308859 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.308940 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.311991 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" event={"ID":"929a21d9-47cd-44cc-b211-258202a86076","Type":"ContainerStarted","Data":"8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.312034 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" event={"ID":"929a21d9-47cd-44cc-b211-258202a86076","Type":"ContainerStarted","Data":"3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.312048 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" event={"ID":"929a21d9-47cd-44cc-b211-258202a86076","Type":"ContainerStarted","Data":"63eac76b570756aaa554c96657c8ea62bf9ba1f65af88dfe4cac9c28439e8107"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.315486 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/1.log" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.324307 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.325922 4781 scope.go:117] "RemoveContainer" containerID="5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.326090 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.349909 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:22Z\\\",\\\"message\\\":\\\"9 6566 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.358987 6566 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359235 6566 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359580 6566 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 00:07:22.359758 6566 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:22.360687 6566 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:22.360715 6566 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:22.360738 6566 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:22.360814 6566 factory.go:656] Stopping watch factory\\\\nI0227 00:07:22.360840 6566 ovnkube.go:599] Stopped ovnkube\\\\nI0227 00:07:22.360849 6566 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:22.360873 6566 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:24Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 3 for removal\\\\nI0227 00:07:24.408545 6749 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 00:07:24.408563 6749 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 00:07:24.408576 6749 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:24.408581 6749 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:24.408591 6749 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 00:07:24.408618 
6749 factory.go:656] Stopping watch factory\\\\nI0227 00:07:24.408501 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0227 00:07:24.408673 6749 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:24.408684 6749 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 00:07:24.408611 6749 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 00:07:24.408708 6749 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 00:07:24.408782 6749 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:24.408880 6749 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:24.408980 6749 ovnkube.go:599] Stopped ovnkube\\\\nI0227 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mount
Path\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.352177 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.352242 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.352261 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.352289 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.352309 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.364076 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.366810 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.366885 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db9s2\" (UniqueName: \"kubernetes.io/projected/e866e388-01ab-407a-a59b-d0ba6c3f6f22-kube-api-access-db9s2\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.366977 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 
00:07:26.367039 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs podName:e866e388-01ab-407a-a59b-d0ba6c3f6f22 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:26.867022523 +0000 UTC m=+116.124562077 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs") pod "network-metrics-daemon-kpnjj" (UID: "e866e388-01ab-407a-a59b-d0ba6c3f6f22") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.377698 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.384538 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db9s2\" (UniqueName: \"kubernetes.io/projected/e866e388-01ab-407a-a59b-d0ba6c3f6f22-kube-api-access-db9s2\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.387938 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc 
kubenswrapper[4781]: I0227 00:07:26.401447 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.416613 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.425475 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc 
kubenswrapper[4781]: I0227 00:07:26.440647 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.455026 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.455077 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.455096 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.455120 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.455139 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.456658 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.469449 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.490688 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-c
ni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.502475 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.516721 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.528544 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.542543 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.554977 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.559116 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.559150 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.559163 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc 
kubenswrapper[4781]: I0227 00:07:26.559181 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.559194 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.575556 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:24Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 3 for removal\\\\nI0227 00:07:24.408545 6749 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 00:07:24.408563 6749 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 00:07:24.408576 6749 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:24.408581 
6749 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:24.408591 6749 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 00:07:24.408618 6749 factory.go:656] Stopping watch factory\\\\nI0227 00:07:24.408501 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0227 00:07:24.408673 6749 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:24.408684 6749 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 00:07:24.408611 6749 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 00:07:24.408708 6749 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 00:07:24.408782 6749 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:24.408880 6749 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:24.408980 6749 ovnkube.go:599] Stopped ovnkube\\\\nI0227 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.592852 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.607797 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.621770 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.661940 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.662017 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.662039 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.662062 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.662113 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.765044 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.765481 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.765546 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.765606 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.765695 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.869370 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.869425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.869441 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.869466 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.869482 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.872285 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.872452 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.872530 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs podName:e866e388-01ab-407a-a59b-d0ba6c3f6f22 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:27.872508578 +0000 UTC m=+117.130048172 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs") pod "network-metrics-daemon-kpnjj" (UID: "e866e388-01ab-407a-a59b-d0ba6c3f6f22") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.972307 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.972354 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.972372 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.972394 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.972411 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.075484 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.075532 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.075549 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.075573 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.075589 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.178155 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.178204 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.178216 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.178234 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.178246 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.281315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.281351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.281362 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.281380 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.281392 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.384036 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.384071 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.384079 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.384092 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.384101 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.487064 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.487160 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.487187 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.487219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.487242 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.591131 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.591182 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.591200 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.591223 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.591247 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.695683 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.695741 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.695763 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.695790 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.695810 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.799196 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.799259 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.799277 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.799326 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.799345 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.887184 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:27 crc kubenswrapper[4781]: E0227 00:07:27.887435 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:27 crc kubenswrapper[4781]: E0227 00:07:27.887535 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs podName:e866e388-01ab-407a-a59b-d0ba6c3f6f22 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:29.887509795 +0000 UTC m=+119.145049379 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs") pod "network-metrics-daemon-kpnjj" (UID: "e866e388-01ab-407a-a59b-d0ba6c3f6f22") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.902596 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.902670 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.902688 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.902711 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.902728 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.005562 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.005665 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.005692 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.005721 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.005746 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.108541 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.108677 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.108706 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.108736 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.108761 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.211854 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.211918 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.211937 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.211963 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.211982 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.308948 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.308995 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.309021 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.308996 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:28 crc kubenswrapper[4781]: E0227 00:07:28.309111 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:28 crc kubenswrapper[4781]: E0227 00:07:28.309452 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:28 crc kubenswrapper[4781]: E0227 00:07:28.309662 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:28 crc kubenswrapper[4781]: E0227 00:07:28.309761 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.316427 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.316491 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.316515 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.316544 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.316566 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.419153 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.419209 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.419229 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.419254 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.419272 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.522558 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.522684 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.522710 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.522740 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.522762 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.625974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.626143 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.626213 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.626253 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.626325 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.729729 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.729804 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.729823 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.729847 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.729869 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.833125 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.833200 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.833218 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.833244 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.833264 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.935974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.936035 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.936057 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.936086 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.936107 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.039200 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.039276 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.039302 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.039331 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.039352 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.143391 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.143463 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.143484 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.143511 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.143532 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.247288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.247361 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.247387 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.247413 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.247431 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.350316 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.350399 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.350425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.350459 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.350483 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.383732 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.383784 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.383802 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.383831 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.383851 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: E0227 00:07:29.407773 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.413568 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.413624 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.413667 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.413691 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.413709 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: E0227 00:07:29.434589 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.440176 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.440290 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.440348 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.440376 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.440425 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: E0227 00:07:29.462277 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.468289 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.468330 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.468341 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.468357 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.468369 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: E0227 00:07:29.487376 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [… status patch payload identical to the 00:07:29.462277 attempt above …] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.492564 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.492665 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.492685 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.492711 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.492733 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: E0227 00:07:29.513056 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [… status patch payload identical to the 00:07:29.462277 attempt above, truncated …
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:29 crc kubenswrapper[4781]: E0227 00:07:29.513228 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.515621 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.515697 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.515714 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.515832 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.515854 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.618686 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.618724 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.618735 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.618751 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.618763 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.720732 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.720780 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.720800 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.720826 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.720847 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.823521 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.823566 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.823584 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.823606 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.823653 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.913034 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:29 crc kubenswrapper[4781]: E0227 00:07:29.913204 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:29 crc kubenswrapper[4781]: E0227 00:07:29.913300 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs podName:e866e388-01ab-407a-a59b-d0ba6c3f6f22 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:33.913270274 +0000 UTC m=+123.170809878 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs") pod "network-metrics-daemon-kpnjj" (UID: "e866e388-01ab-407a-a59b-d0ba6c3f6f22") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.926937 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.926997 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.927019 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.927048 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.927069 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.030248 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.030293 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.030302 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.030317 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.030329 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.133428 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.133505 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.133527 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.133561 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.133584 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.236572 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.236683 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.236709 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.236738 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.236762 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.308698 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.308767 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.308868 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:30 crc kubenswrapper[4781]: E0227 00:07:30.308868 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.308704 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:30 crc kubenswrapper[4781]: E0227 00:07:30.309218 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:30 crc kubenswrapper[4781]: E0227 00:07:30.309311 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:30 crc kubenswrapper[4781]: E0227 00:07:30.309097 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.339129 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.339181 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.339198 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.339221 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.339242 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.441190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.441236 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.441253 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.441277 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.441296 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.544311 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.544375 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.544395 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.544423 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.544442 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.647287 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.647351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.647368 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.647394 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.647413 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.750567 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.750659 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.750679 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.750702 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.750720 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.853566 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.853623 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.853664 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.853687 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.853704 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.956158 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.956219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.956240 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.956272 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.956294 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.058974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.059042 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.059069 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.059097 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.059118 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:31Z","lastTransitionTime":"2026-02-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.162107 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.162178 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.162202 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.162232 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.162259 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:31Z","lastTransitionTime":"2026-02-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:31 crc kubenswrapper[4781]: E0227 00:07:31.262570 4781 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.342088 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\
\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca
233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.363616 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.386771 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.412775 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-c
ni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: E0227 00:07:31.419014 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.429755 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.444565 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f
7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.462464 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.475973 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.489992 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.518248 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.563672 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.573229 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.592990 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:24Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 3 for removal\\\\nI0227 00:07:24.408545 6749 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 00:07:24.408563 6749 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 00:07:24.408576 6749 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:24.408581 
6749 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:24.408591 6749 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 00:07:24.408618 6749 factory.go:656] Stopping watch factory\\\\nI0227 00:07:24.408501 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0227 00:07:24.408673 6749 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:24.408684 6749 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 00:07:24.408611 6749 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 00:07:24.408708 6749 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 00:07:24.408782 6749 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:24.408880 6749 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:24.408980 6749 ovnkube.go:599] Stopped ovnkube\\\\nI0227 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.604874 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.615719 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.625127 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:32 crc 
kubenswrapper[4781]: I0227 00:07:32.308998 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:32 crc kubenswrapper[4781]: I0227 00:07:32.309010 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:32 crc kubenswrapper[4781]: E0227 00:07:32.309187 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:32 crc kubenswrapper[4781]: I0227 00:07:32.309029 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:32 crc kubenswrapper[4781]: E0227 00:07:32.309458 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:32 crc kubenswrapper[4781]: I0227 00:07:32.309474 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:32 crc kubenswrapper[4781]: E0227 00:07:32.309552 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:32 crc kubenswrapper[4781]: E0227 00:07:32.310571 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.288914 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.310788 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.330387 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.345006 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.375595 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:24Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 3 for removal\\\\nI0227 00:07:24.408545 6749 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 00:07:24.408563 6749 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 00:07:24.408576 6749 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:24.408581 
6749 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:24.408591 6749 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 00:07:24.408618 6749 factory.go:656] Stopping watch factory\\\\nI0227 00:07:24.408501 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0227 00:07:24.408673 6749 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:24.408684 6749 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 00:07:24.408611 6749 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 00:07:24.408708 6749 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 00:07:24.408782 6749 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:24.408880 6749 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:24.408980 6749 ovnkube.go:599] Stopped ovnkube\\\\nI0227 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.396940 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.416583 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.432881 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc 
kubenswrapper[4781]: I0227 00:07:33.458030 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c
980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.474400 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc0
86a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.492457 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729f
f1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.526373 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.548322 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.569489 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.591361 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.610518 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.628172 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.954901 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:33 crc kubenswrapper[4781]: E0227 00:07:33.955068 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:33 crc kubenswrapper[4781]: E0227 00:07:33.955164 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs podName:e866e388-01ab-407a-a59b-d0ba6c3f6f22 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:41.955141254 +0000 UTC m=+131.212680838 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs") pod "network-metrics-daemon-kpnjj" (UID: "e866e388-01ab-407a-a59b-d0ba6c3f6f22") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:34 crc kubenswrapper[4781]: I0227 00:07:34.308790 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:34 crc kubenswrapper[4781]: I0227 00:07:34.308892 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:34 crc kubenswrapper[4781]: I0227 00:07:34.308915 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:34 crc kubenswrapper[4781]: I0227 00:07:34.308914 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:34 crc kubenswrapper[4781]: E0227 00:07:34.309056 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:34 crc kubenswrapper[4781]: E0227 00:07:34.309294 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:34 crc kubenswrapper[4781]: E0227 00:07:34.309428 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:34 crc kubenswrapper[4781]: E0227 00:07:34.309532 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:36 crc kubenswrapper[4781]: I0227 00:07:36.309300 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:36 crc kubenswrapper[4781]: I0227 00:07:36.309410 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:36 crc kubenswrapper[4781]: I0227 00:07:36.309343 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:36 crc kubenswrapper[4781]: I0227 00:07:36.309499 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:36 crc kubenswrapper[4781]: E0227 00:07:36.309564 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:36 crc kubenswrapper[4781]: E0227 00:07:36.309731 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:36 crc kubenswrapper[4781]: E0227 00:07:36.309850 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:36 crc kubenswrapper[4781]: E0227 00:07:36.309913 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:36 crc kubenswrapper[4781]: E0227 00:07:36.420399 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:07:38 crc kubenswrapper[4781]: I0227 00:07:38.309126 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:38 crc kubenswrapper[4781]: I0227 00:07:38.309213 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:38 crc kubenswrapper[4781]: I0227 00:07:38.309250 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:38 crc kubenswrapper[4781]: E0227 00:07:38.309405 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:38 crc kubenswrapper[4781]: I0227 00:07:38.309559 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:38 crc kubenswrapper[4781]: E0227 00:07:38.309778 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:38 crc kubenswrapper[4781]: E0227 00:07:38.309938 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:38 crc kubenswrapper[4781]: E0227 00:07:38.310676 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:38 crc kubenswrapper[4781]: I0227 00:07:38.311298 4781 scope.go:117] "RemoveContainer" containerID="5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.377576 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/2.log" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.378690 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/1.log" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.382702 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807" exitCode=1 Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.382772 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807"} Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.382832 4781 scope.go:117] "RemoveContainer" containerID="5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.383970 4781 scope.go:117] "RemoveContainer" containerID="cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807" Feb 27 00:07:39 crc kubenswrapper[4781]: E0227 00:07:39.384220 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.401886 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.425937 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.446320 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.465309 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.482043 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.496962 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.526547 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:24Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 3 for removal\\\\nI0227 00:07:24.408545 6749 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 00:07:24.408563 6749 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 00:07:24.408576 6749 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:24.408581 
6749 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:24.408591 6749 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 00:07:24.408618 6749 factory.go:656] Stopping watch factory\\\\nI0227 00:07:24.408501 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0227 00:07:24.408673 6749 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:24.408684 6749 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 00:07:24.408611 6749 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 00:07:24.408708 6749 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 00:07:24.408782 6749 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:24.408880 6749 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:24.408980 6749 ovnkube.go:599] Stopped ovnkube\\\\nI0227 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/v
ar/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.544198 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc 
kubenswrapper[4781]: I0227 00:07:39.563526 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.583219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.583307 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.583324 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.583346 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.583361 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:39Z","lastTransitionTime":"2026-02-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.586580 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: E0227 00:07:39.606760 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.610506 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.611561 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.611617 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.611656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.611679 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.611696 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:39Z","lastTransitionTime":"2026-02-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:39 crc kubenswrapper[4781]: E0227 00:07:39.629819 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.635038 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.635077 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.635089 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.635106 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.635117 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:39Z","lastTransitionTime":"2026-02-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.635310 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.650580 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: E0227 00:07:39.654535 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.659320 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.659364 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.659379 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.659400 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.659415 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:39Z","lastTransitionTime":"2026-02-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.667364 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: E0227 00:07:39.680784 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.685673 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.685707 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.685719 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.685737 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.685769 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:39Z","lastTransitionTime":"2026-02-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.695152 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: E0227 00:07:39.704540 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: E0227 00:07:39.704726 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.710194 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300
f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:40 crc kubenswrapper[4781]: I0227 00:07:40.308724 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:40 crc kubenswrapper[4781]: I0227 00:07:40.308807 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:40 crc kubenswrapper[4781]: I0227 00:07:40.308743 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:40 crc kubenswrapper[4781]: E0227 00:07:40.308906 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:40 crc kubenswrapper[4781]: I0227 00:07:40.308992 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:40 crc kubenswrapper[4781]: E0227 00:07:40.309169 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:40 crc kubenswrapper[4781]: E0227 00:07:40.309227 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:40 crc kubenswrapper[4781]: E0227 00:07:40.309264 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:40 crc kubenswrapper[4781]: I0227 00:07:40.390155 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/2.log" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.328161 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.349400 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.365598 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc 
kubenswrapper[4781]: I0227 00:07:41.383172 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.415943 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: E0227 00:07:41.421034 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.438398 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300
f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.454453 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.479872 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73
cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:
07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.496873 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.516341 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.532911 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.544256 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.554200 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.584346 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:24Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 3 for removal\\\\nI0227 00:07:24.408545 6749 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 00:07:24.408563 6749 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 00:07:24.408576 6749 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:24.408581 
6749 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:24.408591 6749 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 00:07:24.408618 6749 factory.go:656] Stopping watch factory\\\\nI0227 00:07:24.408501 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0227 00:07:24.408673 6749 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:24.408684 6749 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 00:07:24.408611 6749 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 00:07:24.408708 6749 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 00:07:24.408782 6749 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:24.408880 6749 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:24.408980 6749 ovnkube.go:599] Stopped ovnkube\\\\nI0227 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/v
ar/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.607871 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.622977 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.046465 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.046733 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.046859 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs podName:e866e388-01ab-407a-a59b-d0ba6c3f6f22 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:58.046830029 +0000 UTC m=+147.304369613 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs") pod "network-metrics-daemon-kpnjj" (UID: "e866e388-01ab-407a-a59b-d0ba6c3f6f22") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.147268 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.147378 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.147436 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:08:14.147403234 +0000 UTC m=+163.404942828 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.147480 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.147493 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.147575 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.147610 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:08:14.147586249 +0000 UTC m=+163.405125833 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.147672 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.147827 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.147883 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:08:14.147868737 +0000 UTC m=+163.405408331 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.147968 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.147992 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.148010 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.148057 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 00:08:14.148042551 +0000 UTC m=+163.405582135 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.148137 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.148189 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.148205 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.148248 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 00:08:14.148235707 +0000 UTC m=+163.405775291 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.309089 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.309149 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.309172 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.309090 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.309438 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.309566 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.310252 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.310763 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.321930 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.245409 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.247081 4781 scope.go:117] "RemoveContainer" containerID="cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807" Feb 27 00:07:43 crc kubenswrapper[4781]: E0227 00:07:43.247375 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.271222 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.299264 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73
cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:
07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.313414 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.327872 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729f
f1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.354286 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.372777 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.386051 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.400714 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.413644 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.425442 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.438898 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.448962 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.469535 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.480936 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.490528 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.501532 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.512192 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:44 crc kubenswrapper[4781]: I0227 00:07:44.308495 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:44 crc kubenswrapper[4781]: E0227 00:07:44.308611 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:44 crc kubenswrapper[4781]: I0227 00:07:44.308668 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:44 crc kubenswrapper[4781]: I0227 00:07:44.308777 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:44 crc kubenswrapper[4781]: E0227 00:07:44.308867 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:44 crc kubenswrapper[4781]: E0227 00:07:44.309013 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:44 crc kubenswrapper[4781]: I0227 00:07:44.309322 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:44 crc kubenswrapper[4781]: E0227 00:07:44.309484 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:46 crc kubenswrapper[4781]: I0227 00:07:46.308733 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:46 crc kubenswrapper[4781]: I0227 00:07:46.308804 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:46 crc kubenswrapper[4781]: I0227 00:07:46.308809 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:46 crc kubenswrapper[4781]: I0227 00:07:46.308924 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:46 crc kubenswrapper[4781]: E0227 00:07:46.308915 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:46 crc kubenswrapper[4781]: E0227 00:07:46.309067 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:46 crc kubenswrapper[4781]: E0227 00:07:46.309205 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:46 crc kubenswrapper[4781]: E0227 00:07:46.309370 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:46 crc kubenswrapper[4781]: E0227 00:07:46.422128 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:07:48 crc kubenswrapper[4781]: I0227 00:07:48.308760 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:48 crc kubenswrapper[4781]: I0227 00:07:48.308941 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:48 crc kubenswrapper[4781]: I0227 00:07:48.309060 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:48 crc kubenswrapper[4781]: E0227 00:07:48.308963 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:48 crc kubenswrapper[4781]: E0227 00:07:48.309211 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:48 crc kubenswrapper[4781]: E0227 00:07:48.309296 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:48 crc kubenswrapper[4781]: I0227 00:07:48.308802 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:48 crc kubenswrapper[4781]: E0227 00:07:48.310365 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:49 crc kubenswrapper[4781]: I0227 00:07:49.999458 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:49 crc kubenswrapper[4781]: I0227 00:07:49.999934 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:49.999955 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:49.999983 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.000002 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:49Z","lastTransitionTime":"2026-02-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.018283 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.024100 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.024156 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.024178 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.024203 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.024221 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:50Z","lastTransitionTime":"2026-02-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.044425 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.048809 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.048873 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.048892 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.048916 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.048940 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:50Z","lastTransitionTime":"2026-02-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.068204 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.073275 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.073307 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.073319 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.073334 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.073345 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:50Z","lastTransitionTime":"2026-02-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.093126 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.097787 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.097836 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.097851 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.097869 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.097884 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:50Z","lastTransitionTime":"2026-02-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.118273 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.118506 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.309045 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.309163 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.309065 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.309230 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.309065 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.309237 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.309339 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.309415 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.332992 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.355932 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.373806 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.391123 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.408843 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: E0227 00:07:51.422731 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.438264 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.456624 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.472355 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.488613 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.507253 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.521964 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc 
kubenswrapper[4781]: I0227 00:07:51.533766 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r
q5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.548953 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.578887 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877
441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.595677 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300
f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.614991 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.640150 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73
cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:
07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:52 crc kubenswrapper[4781]: I0227 00:07:52.309231 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:52 crc kubenswrapper[4781]: I0227 00:07:52.309262 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:52 crc kubenswrapper[4781]: E0227 00:07:52.309954 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:52 crc kubenswrapper[4781]: I0227 00:07:52.309358 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:52 crc kubenswrapper[4781]: E0227 00:07:52.310357 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:52 crc kubenswrapper[4781]: I0227 00:07:52.309294 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:52 crc kubenswrapper[4781]: E0227 00:07:52.310772 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:52 crc kubenswrapper[4781]: E0227 00:07:52.310003 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:53 crc kubenswrapper[4781]: I0227 00:07:53.326144 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 27 00:07:54 crc kubenswrapper[4781]: I0227 00:07:54.309033 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:54 crc kubenswrapper[4781]: I0227 00:07:54.309038 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:54 crc kubenswrapper[4781]: I0227 00:07:54.309166 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:54 crc kubenswrapper[4781]: I0227 00:07:54.309178 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:54 crc kubenswrapper[4781]: E0227 00:07:54.309384 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:54 crc kubenswrapper[4781]: E0227 00:07:54.309517 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:54 crc kubenswrapper[4781]: E0227 00:07:54.309752 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:54 crc kubenswrapper[4781]: E0227 00:07:54.309916 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:56 crc kubenswrapper[4781]: I0227 00:07:56.308569 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:56 crc kubenswrapper[4781]: I0227 00:07:56.308616 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:56 crc kubenswrapper[4781]: I0227 00:07:56.308714 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:56 crc kubenswrapper[4781]: I0227 00:07:56.308671 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:56 crc kubenswrapper[4781]: E0227 00:07:56.308832 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:56 crc kubenswrapper[4781]: E0227 00:07:56.308963 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:56 crc kubenswrapper[4781]: E0227 00:07:56.309103 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:56 crc kubenswrapper[4781]: E0227 00:07:56.309273 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:56 crc kubenswrapper[4781]: E0227 00:07:56.424527 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:07:58 crc kubenswrapper[4781]: I0227 00:07:58.131735 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:58 crc kubenswrapper[4781]: E0227 00:07:58.131942 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:58 crc kubenswrapper[4781]: E0227 00:07:58.132060 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs podName:e866e388-01ab-407a-a59b-d0ba6c3f6f22 nodeName:}" failed. No retries permitted until 2026-02-27 00:08:30.1320274 +0000 UTC m=+179.389566984 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs") pod "network-metrics-daemon-kpnjj" (UID: "e866e388-01ab-407a-a59b-d0ba6c3f6f22") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:58 crc kubenswrapper[4781]: I0227 00:07:58.309276 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:58 crc kubenswrapper[4781]: I0227 00:07:58.309291 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:58 crc kubenswrapper[4781]: I0227 00:07:58.309347 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:58 crc kubenswrapper[4781]: I0227 00:07:58.309436 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:58 crc kubenswrapper[4781]: E0227 00:07:58.309586 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:58 crc kubenswrapper[4781]: E0227 00:07:58.309845 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:58 crc kubenswrapper[4781]: E0227 00:07:58.310073 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:58 crc kubenswrapper[4781]: E0227 00:07:58.310755 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:58 crc kubenswrapper[4781]: I0227 00:07:58.311153 4781 scope.go:117] "RemoveContainer" containerID="cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807" Feb 27 00:07:58 crc kubenswrapper[4781]: E0227 00:07:58.311465 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.308792 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.308928 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.309002 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.308823 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.309098 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.309230 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.309304 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.309436 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.441953 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.442020 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.442032 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.442049 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.442064 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:00Z","lastTransitionTime":"2026-02-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.456529 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.460592 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.460663 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.460715 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.460744 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.460757 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:00Z","lastTransitionTime":"2026-02-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.468806 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/0.log" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.468854 4781 generic.go:334] "Generic (PLEG): container finished" podID="9a6dd1e0-45ab-46f0-b298-d89e47aaeecb" containerID="3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608" exitCode=1 Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.468885 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlstj" event={"ID":"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb","Type":"ContainerDied","Data":"3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608"} Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.469261 4781 scope.go:117] "RemoveContainer" containerID="3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608" Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.473384 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.477402 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.477429 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.477439 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.477452 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.477460 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:00Z","lastTransitionTime":"2026-02-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.485351 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.491004 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.495687 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.495733 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.495753 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.495777 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.495796 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:00Z","lastTransitionTime":"2026-02-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.500039 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.509354 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.512850 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729f
f1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.513333 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.513365 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.513377 4781 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.513393 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.513405 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:00Z","lastTransitionTime":"2026-02-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.526256 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.526436 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.535078 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.548152 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.566123 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:59Z\\\",\\\"message\\\":\\\"2026-02-27T00:07:14+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852\\\\n2026-02-27T00:07:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852 to /host/opt/cni/bin/\\\\n2026-02-27T00:07:14Z [verbose] multus-daemon started\\\\n2026-02-27T00:07:14Z [verbose] Readiness Indicator file check\\\\n2026-02-27T00:07:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.577668 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.587995 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.598561 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.613374 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.623027 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.632283 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.648830 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.664662 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8daa2a3-b955-4821-8179-45f9c2f35e9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0227 00:05:33.165391 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 00:05:33.168776 1 observer_polling.go:159] Starting file observer\\\\nI0227 00:05:33.203828 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 00:05:33.209843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0227 00:06:03.600797 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.675854 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.688131 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.701674 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.712127 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc 
kubenswrapper[4781]: I0227 00:08:01.327898 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.344405 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.360765 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc 
kubenswrapper[4781]: I0227 00:08:01.383261 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.399242 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.414563 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:59Z\\\",\\\"message\\\":\\\"2026-02-27T00:07:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852\\\\n2026-02-27T00:07:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852 to /host/opt/cni/bin/\\\\n2026-02-27T00:07:14Z [verbose] multus-daemon started\\\\n2026-02-27T00:07:14Z [verbose] Readiness Indicator file check\\\\n2026-02-27T00:07:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: E0227 00:08:01.425244 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.434431 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.445684 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.459229 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.472859 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/0.log" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.472905 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlstj" event={"ID":"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb","Type":"ContainerStarted","Data":"3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606"} Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.476818 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.496677 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.515916 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.533377 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8daa2a3-b955-4821-8179-45f9c2f35e9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0227 00:05:33.165391 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 00:05:33.168776 1 observer_polling.go:159] Starting file observer\\\\nI0227 00:05:33.203828 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 00:05:33.209843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0227 00:06:03.600797 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.551396 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.569009 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.582053 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.595886 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.618147 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.629709 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.639607 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.659540 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.676767 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8daa2a3-b955-4821-8179-45f9c2f35e9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0227 00:05:33.165391 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 00:05:33.168776 1 observer_polling.go:159] Starting file observer\\\\nI0227 00:05:33.203828 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 00:05:33.209843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0227 00:06:03.600797 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.693445 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.713618 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.728287 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.752522 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.765550 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc 
kubenswrapper[4781]: I0227 00:08:01.778342 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r
q5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.791934 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.813605 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877
441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.833175 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300
f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.847752 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:59Z\\\",\\\"message\\\":\\\"2026-02-27T00:07:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852\\\\n2026-02-27T00:07:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852 to /host/opt/cni/bin/\\\\n2026-02-27T00:07:14Z [verbose] multus-daemon started\\\\n2026-02-27T00:07:14Z [verbose] 
Readiness Indicator file check\\\\n2026-02-27T00:07:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.868532 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606
c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.888604 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.904131 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.922276 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:02 crc kubenswrapper[4781]: I0227 00:08:02.309136 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:02 crc kubenswrapper[4781]: I0227 00:08:02.309171 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:02 crc kubenswrapper[4781]: I0227 00:08:02.309159 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:02 crc kubenswrapper[4781]: I0227 00:08:02.309248 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:02 crc kubenswrapper[4781]: E0227 00:08:02.309431 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:02 crc kubenswrapper[4781]: E0227 00:08:02.309585 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:02 crc kubenswrapper[4781]: E0227 00:08:02.309809 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:02 crc kubenswrapper[4781]: E0227 00:08:02.310008 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:04 crc kubenswrapper[4781]: I0227 00:08:04.308617 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:04 crc kubenswrapper[4781]: I0227 00:08:04.308762 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:04 crc kubenswrapper[4781]: I0227 00:08:04.308698 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:04 crc kubenswrapper[4781]: I0227 00:08:04.308617 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:04 crc kubenswrapper[4781]: E0227 00:08:04.308956 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:04 crc kubenswrapper[4781]: E0227 00:08:04.309069 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:04 crc kubenswrapper[4781]: E0227 00:08:04.309180 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:04 crc kubenswrapper[4781]: E0227 00:08:04.309388 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:06 crc kubenswrapper[4781]: I0227 00:08:06.309186 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:06 crc kubenswrapper[4781]: I0227 00:08:06.309225 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:06 crc kubenswrapper[4781]: I0227 00:08:06.309295 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:06 crc kubenswrapper[4781]: E0227 00:08:06.309390 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:06 crc kubenswrapper[4781]: I0227 00:08:06.309407 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:06 crc kubenswrapper[4781]: E0227 00:08:06.309555 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:06 crc kubenswrapper[4781]: E0227 00:08:06.309819 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:06 crc kubenswrapper[4781]: E0227 00:08:06.309938 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:06 crc kubenswrapper[4781]: E0227 00:08:06.426797 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:08 crc kubenswrapper[4781]: I0227 00:08:08.308519 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:08 crc kubenswrapper[4781]: I0227 00:08:08.308577 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:08 crc kubenswrapper[4781]: I0227 00:08:08.308543 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:08 crc kubenswrapper[4781]: E0227 00:08:08.308674 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:08 crc kubenswrapper[4781]: I0227 00:08:08.308680 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:08 crc kubenswrapper[4781]: E0227 00:08:08.308776 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:08 crc kubenswrapper[4781]: E0227 00:08:08.308966 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:08 crc kubenswrapper[4781]: E0227 00:08:08.309076 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.309359 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.309449 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.309467 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.309512 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.309730 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.309719 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.309792 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.309869 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.860956 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.860999 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.861010 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.861028 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.861040 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:10Z","lastTransitionTime":"2026-02-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.877653 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.882216 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.882275 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.882294 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.882319 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.882335 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:10Z","lastTransitionTime":"2026-02-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.900851 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.905120 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.905168 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.905187 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.905209 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.905227 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:10Z","lastTransitionTime":"2026-02-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.922776 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.927695 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.927846 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.927866 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.927890 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.927909 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:10Z","lastTransitionTime":"2026-02-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.948734 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.953955 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.954004 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.954015 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.954031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.954043 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:10Z","lastTransitionTime":"2026-02-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.973379 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.973746 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.321760 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f
416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.339124 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.351356 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8daa2a3-b955-4821-8179-45f9c2f35e9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0227 00:05:33.165391 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 00:05:33.168776 1 observer_polling.go:159] Starting file observer\\\\nI0227 00:05:33.203828 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 00:05:33.209843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0227 00:06:03.600797 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.361147 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.372296 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.382920 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.394525 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.405385 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.415874 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc 
kubenswrapper[4781]: E0227 00:08:11.427336 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.427429 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.444973 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.463123 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300
f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.474106 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:59Z\\\",\\\"message\\\":\\\"2026-02-27T00:07:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852\\\\n2026-02-27T00:07:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852 to /host/opt/cni/bin/\\\\n2026-02-27T00:07:14Z [verbose] multus-daemon started\\\\n2026-02-27T00:07:14Z [verbose] 
Readiness Indicator file check\\\\n2026-02-27T00:07:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.488603 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606
c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.498951 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.513438 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.531089 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.547206 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:12 crc kubenswrapper[4781]: I0227 00:08:12.308918 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:12 crc kubenswrapper[4781]: I0227 00:08:12.309003 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:12 crc kubenswrapper[4781]: I0227 00:08:12.309099 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:12 crc kubenswrapper[4781]: E0227 00:08:12.309395 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:12 crc kubenswrapper[4781]: I0227 00:08:12.309436 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:12 crc kubenswrapper[4781]: E0227 00:08:12.309692 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:12 crc kubenswrapper[4781]: E0227 00:08:12.309794 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:12 crc kubenswrapper[4781]: E0227 00:08:12.309923 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.309934 4781 scope.go:117] "RemoveContainer" containerID="cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.522500 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/2.log" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.526823 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"} Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.527245 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.542896 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.555032 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.569059 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.579445 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.599669 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped 
ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.614301 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8daa2a3-b955-4821-8179-45f9c2f35e9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0227 00:05:33.165391 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 00:05:33.168776 1 observer_polling.go:159] Starting file observer\\\\nI0227 00:05:33.203828 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 00:05:33.209843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0227 00:06:03.600797 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.629376 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.645823 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.660517 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.676859 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.700661 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.716643 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc 
kubenswrapper[4781]: I0227 00:08:13.740854 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.775757 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.798961 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.823676 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:59Z\\\",\\\"message\\\":\\\"2026-02-27T00:07:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852\\\\n2026-02-27T00:07:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852 to /host/opt/cni/bin/\\\\n2026-02-27T00:07:14Z [verbose] multus-daemon started\\\\n2026-02-27T00:07:14Z [verbose] 
Readiness Indicator file check\\\\n2026-02-27T00:07:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.843133 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606
c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.856490 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.204171 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.204277 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.204305 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.204346 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.204379 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204487 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204539 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:18.204523918 +0000 UTC m=+227.462063482 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204758 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:18.204747784 +0000 UTC m=+227.462287358 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204824 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204837 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204852 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204881 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:18.204872837 +0000 UTC m=+227.462412401 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204894 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204928 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204966 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204979 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:08:14 crc 
kubenswrapper[4781]: E0227 00:08:14.205021 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:18.20499408 +0000 UTC m=+227.462533664 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.205052 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:18.205037541 +0000 UTC m=+227.462577125 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.308941 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.308983 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.309019 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.309083 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.309516 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.309905 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.310074 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.310102 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.538555 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/3.log" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.539558 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/2.log" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.543992 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" exitCode=1 Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.544045 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"} Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.544099 4781 scope.go:117] "RemoveContainer" containerID="cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.545140 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:08:14 crc 
kubenswrapper[4781]: E0227 00:08:14.545409 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.573136 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\
"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9
fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.591785 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300
f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.607171 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:59Z\\\",\\\"message\\\":\\\"2026-02-27T00:07:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852\\\\n2026-02-27T00:07:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852 to /host/opt/cni/bin/\\\\n2026-02-27T00:07:14Z [verbose] multus-daemon started\\\\n2026-02-27T00:07:14Z [verbose] 
Readiness Indicator file check\\\\n2026-02-27T00:07:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.624529 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606
c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.637759 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.654393 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.668958 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.683530 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.696941 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.713568 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8daa2a3-b955-4821-8179-45f9c2f35e9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0227 00:05:33.165391 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 00:05:33.168776 1 observer_polling.go:159] Starting file observer\\\\nI0227 00:05:33.203828 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 00:05:33.209843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0227 00:06:03.600797 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.728277 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.745712 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.760129 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.771249 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.792379 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:08:14Z\\\",\\\"message\\\":\\\"ch for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0227 00:08:14.171587 7345 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b74
8d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.807788 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.822050 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.834342 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc 
kubenswrapper[4781]: I0227 00:08:15.548115 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/3.log" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.552258 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:08:15 crc kubenswrapper[4781]: E0227 00:08:15.552532 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.567890 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.580290 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.591184 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.603074 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8daa2a3-b955-4821-8179-45f9c2f35e9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0227 00:05:33.165391 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 00:05:33.168776 1 observer_polling.go:159] Starting file observer\\\\nI0227 00:05:33.203828 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 00:05:33.209843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0227 00:06:03.600797 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.613933 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.626886 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.638699 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.647077 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.664294 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:08:14Z\\\",\\\"message\\\":\\\"ch for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0227 00:08:14.171587 7345 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:08:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.674573 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.685971 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.695752 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc 
kubenswrapper[4781]: I0227 00:08:15.715143 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.728137 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.742114 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:59Z\\\",\\\"message\\\":\\\"2026-02-27T00:07:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852\\\\n2026-02-27T00:07:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852 to /host/opt/cni/bin/\\\\n2026-02-27T00:07:14Z [verbose] multus-daemon started\\\\n2026-02-27T00:07:14Z [verbose] Readiness Indicator file check\\\\n2026-02-27T00:07:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.757666 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.767211 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.778427 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:16 crc kubenswrapper[4781]: I0227 00:08:16.309224 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:16 crc kubenswrapper[4781]: I0227 00:08:16.309299 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:16 crc kubenswrapper[4781]: I0227 00:08:16.309260 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:16 crc kubenswrapper[4781]: E0227 00:08:16.309411 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:16 crc kubenswrapper[4781]: I0227 00:08:16.309386 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:16 crc kubenswrapper[4781]: E0227 00:08:16.309568 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:16 crc kubenswrapper[4781]: E0227 00:08:16.309606 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:16 crc kubenswrapper[4781]: E0227 00:08:16.309684 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:16 crc kubenswrapper[4781]: E0227 00:08:16.429158 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:18 crc kubenswrapper[4781]: I0227 00:08:18.308753 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:18 crc kubenswrapper[4781]: I0227 00:08:18.308798 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:18 crc kubenswrapper[4781]: E0227 00:08:18.309321 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:18 crc kubenswrapper[4781]: I0227 00:08:18.308991 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:18 crc kubenswrapper[4781]: I0227 00:08:18.308925 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:18 crc kubenswrapper[4781]: E0227 00:08:18.309430 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:18 crc kubenswrapper[4781]: E0227 00:08:18.309517 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:18 crc kubenswrapper[4781]: E0227 00:08:18.309700 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:20 crc kubenswrapper[4781]: I0227 00:08:20.309193 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:20 crc kubenswrapper[4781]: I0227 00:08:20.309317 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:20 crc kubenswrapper[4781]: I0227 00:08:20.309193 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:20 crc kubenswrapper[4781]: E0227 00:08:20.309331 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:20 crc kubenswrapper[4781]: I0227 00:08:20.309214 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:20 crc kubenswrapper[4781]: E0227 00:08:20.309531 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:20 crc kubenswrapper[4781]: E0227 00:08:20.309586 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:20 crc kubenswrapper[4781]: E0227 00:08:20.309696 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.319942 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.346436 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:08:14Z\\\",\\\"message\\\":\\\"ch for network=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0227 00:08:14.171587 7345 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:08:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.347187 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.347218 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.347227 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.347241 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.347251 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:21Z","lastTransitionTime":"2026-02-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.363876 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8daa2a3-b955-4821-8179-45f9c2f35e9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85e
e7a4a76cd494a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 00:05:33.165391 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 00:05:33.168776 1 observer_polling.go:159] Starting file observer\\\\nI0227 00:05:33.203828 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 00:05:33.209843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0227 00:06:03.600797 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: E0227 00:08:21.366931 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.370798 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.370838 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.370850 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.370868 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.370882 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:21Z","lastTransitionTime":"2026-02-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.381164 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042
976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: E0227 00:08:21.383676 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.387680 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.387709 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.387720 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.387736 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.387745 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:21Z","lastTransitionTime":"2026-02-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.404377 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: E0227 00:08:21.416940 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.421586 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.421710 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.421816 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.422146 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.422337 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:21Z","lastTransitionTime":"2026-02-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.426825 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: E0227 00:08:21.429887 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.441557 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: E0227 00:08:21.445215 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.449014 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.449055 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.449070 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.449089 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.449100 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:21Z","lastTransitionTime":"2026-02-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.463116 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: E0227 00:08:21.467185 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: E0227 00:08:21.467429 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.480210 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc 
kubenswrapper[4781]: I0227 00:08:21.498618 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.530858 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.546502 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.560255 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:59Z\\\",\\\"message\\\":\\\"2026-02-27T00:07:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852\\\\n2026-02-27T00:07:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852 to /host/opt/cni/bin/\\\\n2026-02-27T00:07:14Z [verbose] multus-daemon started\\\\n2026-02-27T00:07:14Z [verbose] 
Readiness Indicator file check\\\\n2026-02-27T00:07:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.576246 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606
c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.586447 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.600281 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.613782 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.628672 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:22 crc kubenswrapper[4781]: I0227 00:08:22.308697 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:22 crc kubenswrapper[4781]: I0227 00:08:22.308756 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:22 crc kubenswrapper[4781]: I0227 00:08:22.308712 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:22 crc kubenswrapper[4781]: I0227 00:08:22.308871 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:22 crc kubenswrapper[4781]: E0227 00:08:22.309137 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:22 crc kubenswrapper[4781]: E0227 00:08:22.309241 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:22 crc kubenswrapper[4781]: E0227 00:08:22.309329 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:22 crc kubenswrapper[4781]: E0227 00:08:22.309370 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:23 crc kubenswrapper[4781]: I0227 00:08:23.324236 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 27 00:08:24 crc kubenswrapper[4781]: I0227 00:08:24.309082 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:24 crc kubenswrapper[4781]: I0227 00:08:24.309512 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:24 crc kubenswrapper[4781]: E0227 00:08:24.309496 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:24 crc kubenswrapper[4781]: I0227 00:08:24.309594 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:24 crc kubenswrapper[4781]: I0227 00:08:24.309870 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:24 crc kubenswrapper[4781]: E0227 00:08:24.309917 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:24 crc kubenswrapper[4781]: E0227 00:08:24.310002 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:24 crc kubenswrapper[4781]: E0227 00:08:24.310203 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:26 crc kubenswrapper[4781]: I0227 00:08:26.309015 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:26 crc kubenswrapper[4781]: I0227 00:08:26.309192 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:26 crc kubenswrapper[4781]: E0227 00:08:26.309285 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:26 crc kubenswrapper[4781]: I0227 00:08:26.309237 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:26 crc kubenswrapper[4781]: I0227 00:08:26.309059 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:26 crc kubenswrapper[4781]: E0227 00:08:26.309527 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:26 crc kubenswrapper[4781]: E0227 00:08:26.309668 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:26 crc kubenswrapper[4781]: E0227 00:08:26.309835 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:26 crc kubenswrapper[4781]: E0227 00:08:26.435464 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:28 crc kubenswrapper[4781]: I0227 00:08:28.309295 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:28 crc kubenswrapper[4781]: I0227 00:08:28.309400 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:28 crc kubenswrapper[4781]: E0227 00:08:28.309490 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:28 crc kubenswrapper[4781]: I0227 00:08:28.309295 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:28 crc kubenswrapper[4781]: E0227 00:08:28.309603 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:28 crc kubenswrapper[4781]: I0227 00:08:28.309332 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:28 crc kubenswrapper[4781]: E0227 00:08:28.309726 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:28 crc kubenswrapper[4781]: E0227 00:08:28.309769 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:29 crc kubenswrapper[4781]: I0227 00:08:29.310896 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:08:29 crc kubenswrapper[4781]: E0227 00:08:29.311174 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:08:30 crc kubenswrapper[4781]: I0227 00:08:30.182459 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:30 crc kubenswrapper[4781]: E0227 00:08:30.182576 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:08:30 crc kubenswrapper[4781]: E0227 00:08:30.182654 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs podName:e866e388-01ab-407a-a59b-d0ba6c3f6f22 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:34.182611403 +0000 UTC m=+243.440150957 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs") pod "network-metrics-daemon-kpnjj" (UID: "e866e388-01ab-407a-a59b-d0ba6c3f6f22") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:08:30 crc kubenswrapper[4781]: I0227 00:08:30.309361 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:30 crc kubenswrapper[4781]: E0227 00:08:30.309506 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:30 crc kubenswrapper[4781]: I0227 00:08:30.309718 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:30 crc kubenswrapper[4781]: I0227 00:08:30.309745 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:30 crc kubenswrapper[4781]: E0227 00:08:30.309817 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:30 crc kubenswrapper[4781]: I0227 00:08:30.309879 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:30 crc kubenswrapper[4781]: E0227 00:08:30.309886 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:30 crc kubenswrapper[4781]: E0227 00:08:30.310114 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.334709 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-d2xt9" podStartSLOduration=125.334679968 podStartE2EDuration="2m5.334679968s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.332184603 +0000 UTC m=+180.589724157" watchObservedRunningTime="2026-02-27 00:08:31.334679968 +0000 UTC m=+180.592219562" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.378399 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=38.378377497 podStartE2EDuration="38.378377497s" podCreationTimestamp="2026-02-27 00:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.37774199 +0000 UTC m=+180.635281574" watchObservedRunningTime="2026-02-27 00:08:31.378377497 +0000 UTC m=+180.635917061" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.390719 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=49.390675274 podStartE2EDuration="49.390675274s" podCreationTimestamp="2026-02-27 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.390316095 +0000 UTC m=+180.647855649" watchObservedRunningTime="2026-02-27 00:08:31.390675274 +0000 UTC m=+180.648214828" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.421901 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=8.421881911 podStartE2EDuration="8.421881911s" podCreationTimestamp="2026-02-27 00:08:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.42186034 +0000 UTC m=+180.679399954" watchObservedRunningTime="2026-02-27 00:08:31.421881911 +0000 UTC m=+180.679421465" Feb 27 00:08:31 crc kubenswrapper[4781]: E0227 00:08:31.436913 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.484251 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.484312 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.484474 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.486311 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.486366 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:31Z","lastTransitionTime":"2026-02-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.523407 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" podStartSLOduration=125.523387683 podStartE2EDuration="2m5.523387683s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.522423908 +0000 UTC m=+180.779963462" watchObservedRunningTime="2026-02-27 00:08:31.523387683 +0000 UTC m=+180.780927237" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.526267 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv"] Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.526731 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.530028 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.530159 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.530227 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.530309 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.547831 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=70.547815464 
podStartE2EDuration="1m10.547815464s" podCreationTimestamp="2026-02-27 00:07:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.547235819 +0000 UTC m=+180.804775373" watchObservedRunningTime="2026-02-27 00:08:31.547815464 +0000 UTC m=+180.805355018" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.570978 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=71.570948142 podStartE2EDuration="1m11.570948142s" podCreationTimestamp="2026-02-27 00:07:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.57085793 +0000 UTC m=+180.828397484" watchObservedRunningTime="2026-02-27 00:08:31.570948142 +0000 UTC m=+180.828487696" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.584390 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tlstj" podStartSLOduration=125.584367389 podStartE2EDuration="2m5.584367389s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.583519017 +0000 UTC m=+180.841058581" watchObservedRunningTime="2026-02-27 00:08:31.584367389 +0000 UTC m=+180.841906953" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.596715 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4dd29f3f-2201-4879-a479-3f6a0ed912a5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 
00:08:31.596777 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4dd29f3f-2201-4879-a479-3f6a0ed912a5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.596793 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4dd29f3f-2201-4879-a479-3f6a0ed912a5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.596813 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4dd29f3f-2201-4879-a479-3f6a0ed912a5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.596983 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dd29f3f-2201-4879-a479-3f6a0ed912a5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.611248 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rc856" podStartSLOduration=125.611231683 
podStartE2EDuration="2m5.611231683s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.610904714 +0000 UTC m=+180.868444298" watchObservedRunningTime="2026-02-27 00:08:31.611231683 +0000 UTC m=+180.868771267" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.611800 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" podStartSLOduration=125.611790417 podStartE2EDuration="2m5.611790417s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.601395979 +0000 UTC m=+180.858935553" watchObservedRunningTime="2026-02-27 00:08:31.611790417 +0000 UTC m=+180.869329981" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.661298 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podStartSLOduration=125.661276756 podStartE2EDuration="2m5.661276756s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.651070522 +0000 UTC m=+180.908610086" watchObservedRunningTime="2026-02-27 00:08:31.661276756 +0000 UTC m=+180.918816310" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.698245 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4dd29f3f-2201-4879-a479-3f6a0ed912a5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc 
kubenswrapper[4781]: I0227 00:08:31.698321 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4dd29f3f-2201-4879-a479-3f6a0ed912a5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.698363 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4dd29f3f-2201-4879-a479-3f6a0ed912a5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.698422 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dd29f3f-2201-4879-a479-3f6a0ed912a5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.698436 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4dd29f3f-2201-4879-a479-3f6a0ed912a5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.698480 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4dd29f3f-2201-4879-a479-3f6a0ed912a5-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.698451 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4dd29f3f-2201-4879-a479-3f6a0ed912a5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.699276 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4dd29f3f-2201-4879-a479-3f6a0ed912a5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.703790 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dd29f3f-2201-4879-a479-3f6a0ed912a5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.714868 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4dd29f3f-2201-4879-a479-3f6a0ed912a5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.839247 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: W0227 00:08:31.863467 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dd29f3f_2201_4879_a479_3f6a0ed912a5.slice/crio-bbb84661c7519ddd74baa960cadc55bf8c7757bc3ee10e5d9364a41bbccba7c7 WatchSource:0}: Error finding container bbb84661c7519ddd74baa960cadc55bf8c7757bc3ee10e5d9364a41bbccba7c7: Status 404 returned error can't find the container with id bbb84661c7519ddd74baa960cadc55bf8c7757bc3ee10e5d9364a41bbccba7c7 Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.308833 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:32 crc kubenswrapper[4781]: E0227 00:08:32.309334 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.308922 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.308900 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:32 crc kubenswrapper[4781]: E0227 00:08:32.309476 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.308984 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:32 crc kubenswrapper[4781]: E0227 00:08:32.309908 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:32 crc kubenswrapper[4781]: E0227 00:08:32.309862 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.401337 4781 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.412179 4781 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.629353 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" event={"ID":"4dd29f3f-2201-4879-a479-3f6a0ed912a5","Type":"ContainerStarted","Data":"949e7e480620353250e0403b1a0fb8c3d204ec52dd6e02c407deef70af34a2ba"} Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.629428 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" event={"ID":"4dd29f3f-2201-4879-a479-3f6a0ed912a5","Type":"ContainerStarted","Data":"bbb84661c7519ddd74baa960cadc55bf8c7757bc3ee10e5d9364a41bbccba7c7"} Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.642442 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" podStartSLOduration=126.642426224 podStartE2EDuration="2m6.642426224s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:32.641964242 +0000 UTC m=+181.899503826" watchObservedRunningTime="2026-02-27 00:08:32.642426224 +0000 UTC m=+181.899965798" Feb 27 00:08:34 crc kubenswrapper[4781]: I0227 00:08:34.308678 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:34 crc kubenswrapper[4781]: I0227 00:08:34.308790 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:34 crc kubenswrapper[4781]: E0227 00:08:34.308809 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:34 crc kubenswrapper[4781]: I0227 00:08:34.308945 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:34 crc kubenswrapper[4781]: E0227 00:08:34.309049 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:34 crc kubenswrapper[4781]: E0227 00:08:34.309299 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:34 crc kubenswrapper[4781]: I0227 00:08:34.309834 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:34 crc kubenswrapper[4781]: E0227 00:08:34.310001 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:36 crc kubenswrapper[4781]: I0227 00:08:36.308351 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:36 crc kubenswrapper[4781]: I0227 00:08:36.308413 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:36 crc kubenswrapper[4781]: I0227 00:08:36.308459 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:36 crc kubenswrapper[4781]: E0227 00:08:36.308565 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:36 crc kubenswrapper[4781]: I0227 00:08:36.308614 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:36 crc kubenswrapper[4781]: E0227 00:08:36.308718 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:36 crc kubenswrapper[4781]: E0227 00:08:36.308938 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:36 crc kubenswrapper[4781]: E0227 00:08:36.309013 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:36 crc kubenswrapper[4781]: E0227 00:08:36.438576 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:38 crc kubenswrapper[4781]: I0227 00:08:38.308374 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:38 crc kubenswrapper[4781]: I0227 00:08:38.308445 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:38 crc kubenswrapper[4781]: I0227 00:08:38.308480 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:38 crc kubenswrapper[4781]: E0227 00:08:38.308582 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:38 crc kubenswrapper[4781]: E0227 00:08:38.308752 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:38 crc kubenswrapper[4781]: I0227 00:08:38.308806 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:38 crc kubenswrapper[4781]: E0227 00:08:38.308920 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:38 crc kubenswrapper[4781]: E0227 00:08:38.308958 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:40 crc kubenswrapper[4781]: I0227 00:08:40.309306 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:40 crc kubenswrapper[4781]: I0227 00:08:40.309421 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:40 crc kubenswrapper[4781]: I0227 00:08:40.309455 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:40 crc kubenswrapper[4781]: I0227 00:08:40.309449 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:40 crc kubenswrapper[4781]: E0227 00:08:40.309828 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:40 crc kubenswrapper[4781]: E0227 00:08:40.310300 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:40 crc kubenswrapper[4781]: E0227 00:08:40.310430 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:40 crc kubenswrapper[4781]: E0227 00:08:40.310601 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:41 crc kubenswrapper[4781]: E0227 00:08:41.439219 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:42 crc kubenswrapper[4781]: I0227 00:08:42.308654 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:42 crc kubenswrapper[4781]: I0227 00:08:42.308749 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:42 crc kubenswrapper[4781]: I0227 00:08:42.308749 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:42 crc kubenswrapper[4781]: I0227 00:08:42.308673 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:42 crc kubenswrapper[4781]: E0227 00:08:42.308891 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:42 crc kubenswrapper[4781]: E0227 00:08:42.309040 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:42 crc kubenswrapper[4781]: E0227 00:08:42.309125 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:42 crc kubenswrapper[4781]: E0227 00:08:42.309337 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:43 crc kubenswrapper[4781]: I0227 00:08:43.310249 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:08:43 crc kubenswrapper[4781]: E0227 00:08:43.310470 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:08:44 crc kubenswrapper[4781]: I0227 00:08:44.308975 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:44 crc kubenswrapper[4781]: I0227 00:08:44.309055 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:44 crc kubenswrapper[4781]: E0227 00:08:44.309157 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:44 crc kubenswrapper[4781]: I0227 00:08:44.308997 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:44 crc kubenswrapper[4781]: I0227 00:08:44.309271 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:44 crc kubenswrapper[4781]: E0227 00:08:44.309570 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:44 crc kubenswrapper[4781]: E0227 00:08:44.309888 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:44 crc kubenswrapper[4781]: E0227 00:08:44.309754 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.308841 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.308974 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.309012 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:46 crc kubenswrapper[4781]: E0227 00:08:46.309009 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.309077 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:46 crc kubenswrapper[4781]: E0227 00:08:46.309238 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:46 crc kubenswrapper[4781]: E0227 00:08:46.309363 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:46 crc kubenswrapper[4781]: E0227 00:08:46.309457 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:46 crc kubenswrapper[4781]: E0227 00:08:46.440874 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.679235 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/1.log" Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.679890 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/0.log" Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.679950 4781 generic.go:334] "Generic (PLEG): container finished" podID="9a6dd1e0-45ab-46f0-b298-d89e47aaeecb" containerID="3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606" exitCode=1 Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.679986 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlstj" event={"ID":"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb","Type":"ContainerDied","Data":"3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606"} Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.680030 4781 scope.go:117] "RemoveContainer" containerID="3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608" Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.680794 4781 scope.go:117] "RemoveContainer" containerID="3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606" Feb 27 00:08:46 crc kubenswrapper[4781]: E0227 00:08:46.681382 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-tlstj_openshift-multus(9a6dd1e0-45ab-46f0-b298-d89e47aaeecb)\"" pod="openshift-multus/multus-tlstj" podUID="9a6dd1e0-45ab-46f0-b298-d89e47aaeecb" Feb 27 00:08:47 crc kubenswrapper[4781]: I0227 00:08:47.692704 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/1.log" Feb 27 00:08:48 crc 
kubenswrapper[4781]: I0227 00:08:48.309195 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:48 crc kubenswrapper[4781]: I0227 00:08:48.309233 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:48 crc kubenswrapper[4781]: I0227 00:08:48.309401 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:48 crc kubenswrapper[4781]: E0227 00:08:48.309406 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:48 crc kubenswrapper[4781]: I0227 00:08:48.309428 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:48 crc kubenswrapper[4781]: E0227 00:08:48.309481 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:48 crc kubenswrapper[4781]: E0227 00:08:48.309563 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:48 crc kubenswrapper[4781]: E0227 00:08:48.309827 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:50 crc kubenswrapper[4781]: I0227 00:08:50.308607 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:50 crc kubenswrapper[4781]: I0227 00:08:50.308717 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:50 crc kubenswrapper[4781]: I0227 00:08:50.308741 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:50 crc kubenswrapper[4781]: E0227 00:08:50.308795 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:50 crc kubenswrapper[4781]: I0227 00:08:50.308821 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:50 crc kubenswrapper[4781]: E0227 00:08:50.308932 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:50 crc kubenswrapper[4781]: E0227 00:08:50.309029 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:50 crc kubenswrapper[4781]: E0227 00:08:50.309148 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:51 crc kubenswrapper[4781]: E0227 00:08:51.441566 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:52 crc kubenswrapper[4781]: I0227 00:08:52.308532 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:52 crc kubenswrapper[4781]: I0227 00:08:52.308715 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:52 crc kubenswrapper[4781]: I0227 00:08:52.308755 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:52 crc kubenswrapper[4781]: I0227 00:08:52.308812 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:52 crc kubenswrapper[4781]: E0227 00:08:52.308717 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:52 crc kubenswrapper[4781]: E0227 00:08:52.308927 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:52 crc kubenswrapper[4781]: E0227 00:08:52.309101 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:52 crc kubenswrapper[4781]: E0227 00:08:52.309193 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:54 crc kubenswrapper[4781]: I0227 00:08:54.308720 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:54 crc kubenswrapper[4781]: I0227 00:08:54.308779 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:54 crc kubenswrapper[4781]: I0227 00:08:54.308841 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:54 crc kubenswrapper[4781]: E0227 00:08:54.308912 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:54 crc kubenswrapper[4781]: I0227 00:08:54.308739 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:54 crc kubenswrapper[4781]: E0227 00:08:54.309091 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:54 crc kubenswrapper[4781]: E0227 00:08:54.309203 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:54 crc kubenswrapper[4781]: E0227 00:08:54.309289 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:55 crc kubenswrapper[4781]: I0227 00:08:55.309858 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:08:55 crc kubenswrapper[4781]: I0227 00:08:55.728394 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/3.log" Feb 27 00:08:55 crc kubenswrapper[4781]: I0227 00:08:55.732577 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"} Feb 27 00:08:55 crc kubenswrapper[4781]: I0227 00:08:55.732939 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:08:56 crc kubenswrapper[4781]: I0227 00:08:56.309178 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:56 crc kubenswrapper[4781]: I0227 00:08:56.309282 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:56 crc kubenswrapper[4781]: I0227 00:08:56.309371 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:56 crc kubenswrapper[4781]: E0227 00:08:56.309500 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:56 crc kubenswrapper[4781]: I0227 00:08:56.309582 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:56 crc kubenswrapper[4781]: E0227 00:08:56.309716 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:56 crc kubenswrapper[4781]: E0227 00:08:56.309837 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:56 crc kubenswrapper[4781]: E0227 00:08:56.310023 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:56 crc kubenswrapper[4781]: I0227 00:08:56.384329 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podStartSLOduration=150.384300853 podStartE2EDuration="2m30.384300853s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:55.783938069 +0000 UTC m=+205.041477633" watchObservedRunningTime="2026-02-27 00:08:56.384300853 +0000 UTC m=+205.641840457" Feb 27 00:08:56 crc kubenswrapper[4781]: I0227 00:08:56.385710 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kpnjj"] Feb 27 00:08:56 crc kubenswrapper[4781]: E0227 00:08:56.442828 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:56 crc kubenswrapper[4781]: I0227 00:08:56.736173 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:56 crc kubenswrapper[4781]: E0227 00:08:56.736314 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:58 crc kubenswrapper[4781]: I0227 00:08:58.309400 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:58 crc kubenswrapper[4781]: I0227 00:08:58.309500 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:58 crc kubenswrapper[4781]: I0227 00:08:58.309408 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:58 crc kubenswrapper[4781]: I0227 00:08:58.309400 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:58 crc kubenswrapper[4781]: E0227 00:08:58.309589 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:58 crc kubenswrapper[4781]: E0227 00:08:58.309905 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:58 crc kubenswrapper[4781]: E0227 00:08:58.309948 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:58 crc kubenswrapper[4781]: E0227 00:08:58.310076 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:59 crc kubenswrapper[4781]: I0227 00:08:59.309585 4781 scope.go:117] "RemoveContainer" containerID="3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606" Feb 27 00:08:59 crc kubenswrapper[4781]: I0227 00:08:59.755140 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/1.log" Feb 27 00:08:59 crc kubenswrapper[4781]: I0227 00:08:59.755602 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlstj" event={"ID":"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb","Type":"ContainerStarted","Data":"a286864c68415e96f38bba630ac2325989837881e34a926c93977715f330a129"} Feb 27 00:09:00 crc kubenswrapper[4781]: I0227 00:09:00.309189 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:09:00 crc kubenswrapper[4781]: I0227 00:09:00.309263 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:09:00 crc kubenswrapper[4781]: I0227 00:09:00.309277 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:09:00 crc kubenswrapper[4781]: I0227 00:09:00.309220 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:09:00 crc kubenswrapper[4781]: E0227 00:09:00.309430 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:09:00 crc kubenswrapper[4781]: E0227 00:09:00.309573 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:09:00 crc kubenswrapper[4781]: E0227 00:09:00.309695 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:09:00 crc kubenswrapper[4781]: E0227 00:09:00.309874 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.879205 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.931915 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cr2bb"] Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.933098 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.936947 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"] Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.937818 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.942754 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-29z97"] Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.943546 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.945065 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ktjdc"] Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.945564 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.945777 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.946022 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.948254 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.948540 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.948587 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.948885 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.948957 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.949188 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.949402 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.948905 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.949706 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.950450 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"etcd-client"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.950593 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.950761 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.950886 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.951078 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.951255 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.951989 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.952148 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.955803 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.956008 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.956175 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.956513 4781 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-controller-manager"/"client-ca"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.956554 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.956727 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.956932 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.957307 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.958247 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.958477 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.958723 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.962595 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.968588 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-vtsxv"]
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.969336 4781 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-f9d7485db-vtsxv"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.969923 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls"]
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.970675 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.973698 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.974273 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-f4jxd"]
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.975014 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.982738 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.983107 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.983261 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.983411 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.983556 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.983713 4781 reflector.go:368] Caches populated for
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.983864 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.984841 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.990707 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.991147 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.991402 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.991553 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.991736 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.991905 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.992023 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.992145 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.992263 4781 reflector.go:368] Caches
populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.992432 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.992549 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.992692 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.992809 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.993427 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"]
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.993903 4781 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.997163 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.008664 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.015521 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.018996 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.071012 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qjwrj"]
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.071376 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fdkct"]
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.071675 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2zw27"]
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.071938 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2zw27"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.072198 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qjwrj"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.072348 4781 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078103 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-serving-cert\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078139 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrg6p\" (UniqueName: \"kubernetes.io/projected/a24423db-53f2-4555-81e4-228b3911e144-kube-api-access-xrg6p\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078157 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb172836-9833-43d5-a99b-cc01b3dd6694-config\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078174 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnbwb\" (UniqueName: \"kubernetes.io/projected/c7332c18-9748-49d2-b512-a46c2d1fcb79-kube-api-access-tnbwb\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078190 4781 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-oauth-config\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078203 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-client\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078217 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77s24\" (UniqueName: \"kubernetes.io/projected/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-kube-api-access-77s24\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078233 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7332c18-9748-49d2-b512-a46c2d1fcb79-serving-cert\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078258 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-etcd-serving-ca\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") "
pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078272 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-serving-cert\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078287 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cb172836-9833-43d5-a99b-cc01b3dd6694-machine-approver-tls\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078304 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078320 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-client-ca\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078334 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName:
\"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078349 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-ca\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078362 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078377 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24423db-53f2-4555-81e4-228b3911e144-serving-cert\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078394 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjdbq\" (UniqueName: \"kubernetes.io/projected/76705148-274c-4428-9508-13fe1193646e-kube-api-access-xjdbq\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227
00:09:02.078409 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-etcd-client\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078425 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-image-import-ca\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078454 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-service-ca\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078475 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078497 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-oauth-serving-cert\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") "
pod="openshift-console/console-f9d7485db-vtsxv"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078526 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-serving-cert\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078548 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-service-ca\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078564 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-audit\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078578 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d9ce11ed-3022-47e0-8150-8af94af65076-audit-dir\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078592 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f8zb\" (UniqueName: \"kubernetes.io/projected/b9dadb6a-e49e-4473-8338-3af567aacb4a-kube-api-access-9f8zb\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID:
\"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078610 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-console-config\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078647 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-encryption-config\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078666 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d9ce11ed-3022-47e0-8150-8af94af65076-node-pullsecrets\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078685 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-config\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078706 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-config\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078721 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79"]
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.079182 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752"]
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.079394 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds"]
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.079493 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.079694 4781 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.079724 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080056 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078726 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grq2j\" (UniqueName: \"kubernetes.io/projected/d9ce11ed-3022-47e0-8150-8af94af65076-kube-api-access-grq2j\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080238 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-audit-policies\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080261 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf6dv\" (UniqueName: \"kubernetes.io/projected/cb172836-9833-43d5-a99b-cc01b3dd6694-kube-api-access-tf6dv\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080285 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName:
\"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-serving-cert\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080299 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080354 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080577 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080303 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-encryption-config\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080910 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080919 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-trusted-ca-bundle\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080964 4781 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb172836-9833-43d5-a99b-cc01b3dd6694-auth-proxy-config\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.081018 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-config\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.081042 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-client-ca\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.081070 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-config\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.081130 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-etcd-client\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\")
" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.081166 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9dadb6a-e49e-4473-8338-3af567aacb4a-audit-dir\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.082229 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d9gmh"]
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.082939 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9z8qr"]
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.082962 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.083268 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.088511 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz"]
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.088969 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.090901 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2zhrk"]
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.091531 4781 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.096663 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.097193 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.099776 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tw95c"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.100220 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-8lcg4"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.100407 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.100826 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.100945 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.101218 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.101248 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.101259 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.101578 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.101664 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.104965 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.105218 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.105384 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.105737 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.108939 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.109241 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29535840-t9tlz"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.109784 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.109919 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.110218 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.112589 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.112747 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.113248 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.114969 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.115197 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.115365 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.117728 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.121202 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.122044 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.122273 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.122506 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.122816 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.123017 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.123169 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.123305 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.123426 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.123584 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.123742 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.123848 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.124131 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.124253 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.124364 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.124464 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.128200 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.128352 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.128991 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.129080 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.129317 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.129500 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.130197 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.135276 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.148175 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.149456 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d"] Feb 27 00:09:02 crc 
kubenswrapper[4781]: I0227 00:09:02.150224 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.154912 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.157843 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.158942 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.163382 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.164289 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wgpv7"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.164351 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.164643 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.164656 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.164902 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.165160 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.165764 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.166314 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.166486 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.166867 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.167188 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.167647 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.167789 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6rw4v"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.168473 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.168675 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.169397 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.170816 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.171210 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.172204 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.172591 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.175358 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.175952 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.176265 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kxcrw"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.177057 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.177546 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.178510 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.179416 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.179896 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535848-ccctv"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.180492 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535848-ccctv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182427 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-config\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182457 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d9ce11ed-3022-47e0-8150-8af94af65076-node-pullsecrets\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182481 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14579b3e-131e-4e98-b060-a93d2581479c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182501 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz5ss\" (UniqueName: \"kubernetes.io/projected/14579b3e-131e-4e98-b060-a93d2581479c-kube-api-access-cz5ss\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182522 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-config\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182537 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grq2j\" (UniqueName: \"kubernetes.io/projected/d9ce11ed-3022-47e0-8150-8af94af65076-kube-api-access-grq2j\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182569 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-audit-policies\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182585 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkqct\" (UniqueName: \"kubernetes.io/projected/44e0d81c-a6e7-4e95-9901-ea32b8476755-kube-api-access-dkqct\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182604 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/010c6a41-8e2d-4391-ac1b-82814dad98a4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8d9mv\" (UID: \"010c6a41-8e2d-4391-ac1b-82814dad98a4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" Feb 27 00:09:02 crc 
kubenswrapper[4781]: I0227 00:09:02.182659 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/878b625f-d8df-457f-b208-f4bf5807a8d8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182678 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf6dv\" (UniqueName: \"kubernetes.io/projected/cb172836-9833-43d5-a99b-cc01b3dd6694-kube-api-access-tf6dv\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.183987 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-serving-cert\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184013 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzxgj\" (UniqueName: \"kubernetes.io/projected/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-kube-api-access-pzxgj\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.183544 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-audit-policies\") pod 
\"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.183819 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184030 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98d3eede-8852-4bf5-a905-25974e47445f-proxy-tls\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184114 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58vdh\" (UniqueName: \"kubernetes.io/projected/878b625f-d8df-457f-b208-f4bf5807a8d8-kube-api-access-58vdh\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184156 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-encryption-config\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184179 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-trusted-ca-bundle\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " 
pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184202 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb172836-9833-43d5-a99b-cc01b3dd6694-auth-proxy-config\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184226 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184251 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9gbc\" (UniqueName: \"kubernetes.io/projected/ac4a870d-8cda-423b-a15b-391830c944f4-kube-api-access-k9gbc\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184305 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-config\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184326 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-client-ca\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184353 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-config\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184374 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-etcd-client\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184396 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184418 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-serving-cert\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc 
kubenswrapper[4781]: I0227 00:09:02.184440 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/878b625f-d8df-457f-b208-f4bf5807a8d8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184475 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9dadb6a-e49e-4473-8338-3af567aacb4a-audit-dir\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184498 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac4a870d-8cda-423b-a15b-391830c944f4-metrics-tls\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184526 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq5bh\" (UniqueName: \"kubernetes.io/projected/98d3eede-8852-4bf5-a905-25974e47445f-kube-api-access-pq5bh\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184550 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnwlg\" (UniqueName: 
\"kubernetes.io/projected/6497cf4e-c461-4db9-88e4-5de2a5f28404-kube-api-access-qnwlg\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184576 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-serving-cert\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184599 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrg6p\" (UniqueName: \"kubernetes.io/projected/a24423db-53f2-4555-81e4-228b3911e144-kube-api-access-xrg6p\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184619 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184666 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb172836-9833-43d5-a99b-cc01b3dd6694-config\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:02 crc 
kubenswrapper[4781]: I0227 00:09:02.184692 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnbwb\" (UniqueName: \"kubernetes.io/projected/c7332c18-9748-49d2-b512-a46c2d1fcb79-kube-api-access-tnbwb\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184717 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw4cs\" (UniqueName: \"kubernetes.io/projected/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-kube-api-access-mw4cs\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184739 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/878b625f-d8df-457f-b208-f4bf5807a8d8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184761 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-oauth-config\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184783 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2sq5\" (UniqueName: 
\"kubernetes.io/projected/13b9671c-f825-49de-913c-42e8d161f7f8-kube-api-access-r2sq5\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184812 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-client\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184833 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77s24\" (UniqueName: \"kubernetes.io/projected/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-kube-api-access-77s24\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184856 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7332c18-9748-49d2-b512-a46c2d1fcb79-serving-cert\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184878 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" 
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184898 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184918 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6497cf4e-c461-4db9-88e4-5de2a5f28404-tmpfs\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184941 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pgmz\" (UniqueName: \"kubernetes.io/projected/010c6a41-8e2d-4391-ac1b-82814dad98a4-kube-api-access-9pgmz\") pod \"control-plane-machine-set-operator-78cbb6b69f-8d9mv\" (UID: \"010c6a41-8e2d-4391-ac1b-82814dad98a4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184971 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-etcd-serving-ca\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185000 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-key\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185048 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djzfc\" (UniqueName: \"kubernetes.io/projected/3f3571fd-ce1b-4105-9100-020fd1cd5076-kube-api-access-djzfc\") pod \"cluster-samples-operator-665b6dd947-tnl79\" (UID: \"3f3571fd-ce1b-4105-9100-020fd1cd5076\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185073 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-serving-cert\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185101 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cb172836-9833-43d5-a99b-cc01b3dd6694-machine-approver-tls\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182758 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d9ce11ed-3022-47e0-8150-8af94af65076-node-pullsecrets\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185124 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185189 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9dadb6a-e49e-4473-8338-3af567aacb4a-audit-dir\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185225 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-client-ca\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185265 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-images\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185298 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7llv\" (UniqueName: \"kubernetes.io/projected/ae09caff-6233-41f8-bb7d-a2314363e2fa-kube-api-access-m7llv\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185330 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185348 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-webhook-cert\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185364 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac4a870d-8cda-423b-a15b-391830c944f4-trusted-ca\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185382 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-ca\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185401 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185421 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24423db-53f2-4555-81e4-228b3911e144-serving-cert\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185439 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjdbq\" (UniqueName: \"kubernetes.io/projected/76705148-274c-4428-9508-13fe1193646e-kube-api-access-xjdbq\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185456 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac4a870d-8cda-423b-a15b-391830c944f4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185475 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f3571fd-ce1b-4105-9100-020fd1cd5076-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tnl79\" (UID: \"3f3571fd-ce1b-4105-9100-020fd1cd5076\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185500 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185520 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185542 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14579b3e-131e-4e98-b060-a93d2581479c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185561 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-srv-cert\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185596 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-service-ca\") pod 
\"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185614 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-etcd-client\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185647 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-image-import-ca\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185664 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185684 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-oauth-serving-cert\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185704 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-serving-cert\") pod 
\"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185714 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-config\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185970 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb172836-9833-43d5-a99b-cc01b3dd6694-auth-proxy-config\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186028 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-service-ca\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186054 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-config\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186074 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186085 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186096 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-audit\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186122 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d9ce11ed-3022-47e0-8150-8af94af65076-audit-dir\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186147 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f8zb\" (UniqueName: \"kubernetes.io/projected/b9dadb6a-e49e-4473-8338-3af567aacb4a-kube-api-access-9f8zb\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186164 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/13b9671c-f825-49de-913c-42e8d161f7f8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186184 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-console-config\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186199 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-config\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186214 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-trusted-ca\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186231 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66pts\" (UniqueName: \"kubernetes.io/projected/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-kube-api-access-66pts\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc 
kubenswrapper[4781]: I0227 00:09:02.186288 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb172836-9833-43d5-a99b-cc01b3dd6694-config\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186308 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13b9671c-f825-49de-913c-42e8d161f7f8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186327 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-apiservice-cert\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186349 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-encryption-config\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186367 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-images\") pod 
\"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186381 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-cabundle\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186934 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-trusted-ca-bundle\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.187332 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-ca\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.187843 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-service-ca\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185519 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-config\") pod 
\"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.187933 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.188050 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d9ce11ed-3022-47e0-8150-8af94af65076-audit-dir\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.188705 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-etcd-serving-ca\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.189272 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.189458 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-client-ca\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:02 crc 
kubenswrapper[4781]: I0227 00:09:02.189841 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-config\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.190008 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-oauth-serving-cert\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.190150 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-service-ca\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.191899 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.192026 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 
00:09:02.192116 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-config\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.192223 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-client-ca\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.192245 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vtsxv"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.192485 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-audit\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.192500 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-console-config\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.194803 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-574r8"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.195395 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.195494 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-client\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.195536 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-serving-cert\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.195571 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-574r8" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.195970 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24423db-53f2-4555-81e4-228b3911e144-serving-cert\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.196270 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-serving-cert\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.197075 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-etcd-client\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.197262 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-oauth-config\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.197379 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-serving-cert\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.197551 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.197570 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cb172836-9833-43d5-a99b-cc01b3dd6694-machine-approver-tls\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.198231 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7332c18-9748-49d2-b512-a46c2d1fcb79-serving-cert\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.198778 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-encryption-config\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.200774 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-image-import-ca\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.200860 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.200894 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.201939 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.203378 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-29z97"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.204757 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.205687 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-serving-cert\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.206062 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-etcd-client\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.206876 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-encryption-config\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.216374 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qjwrj"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.223344 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.226888 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29535840-t9tlz"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.239152 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.242096 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 
00:09:02.244934 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fdkct"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.246968 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2zhrk"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.249813 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ktjdc"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.251280 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.259948 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-f4jxd"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.260056 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.261574 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.263927 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wgpv7"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.264997 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.266097 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9z8qr"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.267429 4781 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2zw27"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.268563 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535848-ccctv"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.269749 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.271129 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.272580 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.274766 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.274843 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.276212 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.276837 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.279240 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.280491 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.282549 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6rw4v"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.286983 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287396 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-images\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287435 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-cabundle\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287452 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14579b3e-131e-4e98-b060-a93d2581479c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287470 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz5ss\" (UniqueName: 
\"kubernetes.io/projected/14579b3e-131e-4e98-b060-a93d2581479c-kube-api-access-cz5ss\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287487 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkqct\" (UniqueName: \"kubernetes.io/projected/44e0d81c-a6e7-4e95-9901-ea32b8476755-kube-api-access-dkqct\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287506 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/010c6a41-8e2d-4391-ac1b-82814dad98a4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8d9mv\" (UID: \"010c6a41-8e2d-4391-ac1b-82814dad98a4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287530 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/878b625f-d8df-457f-b208-f4bf5807a8d8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287553 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98d3eede-8852-4bf5-a905-25974e47445f-proxy-tls\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287570 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58vdh\" (UniqueName: \"kubernetes.io/projected/878b625f-d8df-457f-b208-f4bf5807a8d8-kube-api-access-58vdh\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287592 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzxgj\" (UniqueName: \"kubernetes.io/projected/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-kube-api-access-pzxgj\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287610 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287638 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9gbc\" (UniqueName: \"kubernetes.io/projected/ac4a870d-8cda-423b-a15b-391830c944f4-kube-api-access-k9gbc\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287670 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-serving-cert\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287684 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/878b625f-d8df-457f-b208-f4bf5807a8d8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287704 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287729 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac4a870d-8cda-423b-a15b-391830c944f4-metrics-tls\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287746 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq5bh\" (UniqueName: \"kubernetes.io/projected/98d3eede-8852-4bf5-a905-25974e47445f-kube-api-access-pq5bh\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287764 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnwlg\" (UniqueName: \"kubernetes.io/projected/6497cf4e-c461-4db9-88e4-5de2a5f28404-kube-api-access-qnwlg\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287784 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287802 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw4cs\" (UniqueName: \"kubernetes.io/projected/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-kube-api-access-mw4cs\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287817 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/878b625f-d8df-457f-b208-f4bf5807a8d8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287842 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2sq5\" 
(UniqueName: \"kubernetes.io/projected/13b9671c-f825-49de-913c-42e8d161f7f8-kube-api-access-r2sq5\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287870 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287891 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287915 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6497cf4e-c461-4db9-88e4-5de2a5f28404-tmpfs\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287936 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pgmz\" (UniqueName: \"kubernetes.io/projected/010c6a41-8e2d-4391-ac1b-82814dad98a4-kube-api-access-9pgmz\") pod \"control-plane-machine-set-operator-78cbb6b69f-8d9mv\" (UID: \"010c6a41-8e2d-4391-ac1b-82814dad98a4\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287962 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-key\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287978 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djzfc\" (UniqueName: \"kubernetes.io/projected/3f3571fd-ce1b-4105-9100-020fd1cd5076-kube-api-access-djzfc\") pod \"cluster-samples-operator-665b6dd947-tnl79\" (UID: \"3f3571fd-ce1b-4105-9100-020fd1cd5076\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287997 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-images\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288013 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7llv\" (UniqueName: \"kubernetes.io/projected/ae09caff-6233-41f8-bb7d-a2314363e2fa-kube-api-access-m7llv\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288030 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/ac4a870d-8cda-423b-a15b-391830c944f4-trusted-ca\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288047 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-webhook-cert\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288064 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f3571fd-ce1b-4105-9100-020fd1cd5076-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tnl79\" (UID: \"3f3571fd-ce1b-4105-9100-020fd1cd5076\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288088 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac4a870d-8cda-423b-a15b-391830c944f4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288112 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 
00:09:02.288134 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288151 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14579b3e-131e-4e98-b060-a93d2581479c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288167 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-srv-cert\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288191 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-config\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288207 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: 
\"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288228 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13b9671c-f825-49de-913c-42e8d161f7f8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288243 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-trusted-ca\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288259 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66pts\" (UniqueName: \"kubernetes.io/projected/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-kube-api-access-66pts\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288276 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-config\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288291 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/13b9671c-f825-49de-913c-42e8d161f7f8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288307 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-apiservice-cert\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288598 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14579b3e-131e-4e98-b060-a93d2581479c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288622 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-images\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288792 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/878b625f-d8df-457f-b208-f4bf5807a8d8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 
27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288973 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tw95c"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.289505 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6497cf4e-c461-4db9-88e4-5de2a5f28404-tmpfs\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.289515 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.289793 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-574r8"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.291055 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-config\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.291451 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-serving-cert\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc 
kubenswrapper[4781]: I0227 00:09:02.291445 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-trusted-ca\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.291504 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-sl77b"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.292081 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14579b3e-131e-4e98-b060-a93d2581479c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.292415 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.292428 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-config\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.292460 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f3571fd-ce1b-4105-9100-020fd1cd5076-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tnl79\" (UID: \"3f3571fd-ce1b-4105-9100-020fd1cd5076\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.292646 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-k8qh8"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.293181 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/878b625f-d8df-457f-b208-f4bf5807a8d8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.293391 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.294042 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.295241 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.295521 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.297843 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kxcrw"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.299282 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cr2bb"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.300607 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k8qh8"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.301704 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.302783 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d9gmh"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.303912 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.304929 4781 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wdgtd"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.306321 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.306465 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wdgtd"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.308523 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.308532 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.308723 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.308619 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.315410 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.337190 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.355193 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.375072 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.395762 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.415416 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.435854 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.454825 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.476769 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.494825 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 27 
00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.515977 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.536168 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.555900 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.575871 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.595680 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.615773 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.635514 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.656187 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.676059 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.684539 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac4a870d-8cda-423b-a15b-391830c944f4-metrics-tls\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc 
kubenswrapper[4781]: I0227 00:09:02.695158 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.723071 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.730972 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac4a870d-8cda-423b-a15b-391830c944f4-trusted-ca\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.735204 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.754902 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.775858 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.794897 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.815095 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.855886 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.875096 4781 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.885550 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/010c6a41-8e2d-4391-ac1b-82814dad98a4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8d9mv\" (UID: \"010c6a41-8e2d-4391-ac1b-82814dad98a4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.895415 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.916059 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.935577 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.955829 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.964319 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.976040 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.006777 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.011969 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.015847 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.036432 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.057258 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.075174 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.096368 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.099759 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.115991 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.136095 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.144401 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13b9671c-f825-49de-913c-42e8d161f7f8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.155565 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.160785 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13b9671c-f825-49de-913c-42e8d161f7f8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.174664 4781 request.go:700] Waited for 1.006972722s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/secrets?fieldSelector=metadata.name%3Dkube-storage-version-migrator-operator-dockercfg-2bh8d&limit=500&resourceVersion=0 Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.177127 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.196812 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.216353 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.236013 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.255408 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.279250 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.284322 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-srv-cert\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.287872 4781 secret.go:188] Couldn't get secret 
openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.288090 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-profile-collector-cert podName:ae09caff-6233-41f8-bb7d-a2314363e2fa nodeName:}" failed. No retries permitted until 2026-02-27 00:09:03.788066588 +0000 UTC m=+213.045606192 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-profile-collector-cert") pod "olm-operator-6b444d44fb-mmx87" (UID: "ae09caff-6233-41f8-bb7d-a2314363e2fa") : failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.287882 4781 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.288367 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98d3eede-8852-4bf5-a905-25974e47445f-proxy-tls podName:98d3eede-8852-4bf5-a905-25974e47445f nodeName:}" failed. No retries permitted until 2026-02-27 00:09:03.788349864 +0000 UTC m=+213.045889498 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/98d3eede-8852-4bf5-a905-25974e47445f-proxy-tls") pod "machine-config-operator-74547568cd-rhhqx" (UID: "98d3eede-8852-4bf5-a905-25974e47445f") : failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.287961 4781 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.288740 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-cabundle podName:44e0d81c-a6e7-4e95-9901-ea32b8476755 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:03.788724143 +0000 UTC m=+213.046263777 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-cabundle") pod "service-ca-9c57cc56f-kxcrw" (UID: "44e0d81c-a6e7-4e95-9901-ea32b8476755") : failed to sync configmap cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.288778 4781 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.289017 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-apiservice-cert podName:6497cf4e-c461-4db9-88e4-5de2a5f28404 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:03.789001669 +0000 UTC m=+213.046541313 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-apiservice-cert") pod "packageserver-d55dfcdfc-sw7s5" (UID: "6497cf4e-c461-4db9-88e4-5de2a5f28404") : failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.288905 4781 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.289284 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-images podName:98d3eede-8852-4bf5-a905-25974e47445f nodeName:}" failed. No retries permitted until 2026-02-27 00:09:03.789269435 +0000 UTC m=+213.046809079 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-images") pod "machine-config-operator-74547568cd-rhhqx" (UID: "98d3eede-8852-4bf5-a905-25974e47445f") : failed to sync configmap cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.288918 4781 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.289535 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-webhook-cert podName:6497cf4e-c461-4db9-88e4-5de2a5f28404 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:03.789518601 +0000 UTC m=+213.047058235 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-webhook-cert") pod "packageserver-d55dfcdfc-sw7s5" (UID: "6497cf4e-c461-4db9-88e4-5de2a5f28404") : failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.289978 4781 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.290054 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-key podName:44e0d81c-a6e7-4e95-9901-ea32b8476755 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:03.790041103 +0000 UTC m=+213.047580727 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-key") pod "service-ca-9c57cc56f-kxcrw" (UID: "44e0d81c-a6e7-4e95-9901-ea32b8476755") : failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.290241 4781 secret.go:188] Couldn't get secret openshift-kube-scheduler-operator/kube-scheduler-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.290406 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-serving-cert podName:8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:03.790389511 +0000 UTC m=+213.047929135 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-serving-cert") pod "openshift-kube-scheduler-operator-5fdd9b5758-6w28d" (UID: "8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237") : failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.295217 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.314666 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.334892 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.354965 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.375287 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.395102 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.415044 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.435152 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.454788 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.475444 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.494959 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.514802 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.536061 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.555403 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.575319 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.595328 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.615151 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.635744 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.654514 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.695056 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.714994 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.736481 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.780766 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grq2j\" (UniqueName: \"kubernetes.io/projected/d9ce11ed-3022-47e0-8150-8af94af65076-kube-api-access-grq2j\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.800148 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf6dv\" (UniqueName: \"kubernetes.io/projected/cb172836-9833-43d5-a99b-cc01b3dd6694-kube-api-access-tf6dv\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.806435 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-webhook-cert\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.806489 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.806526 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-apiservice-cert\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.806542 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-cabundle\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.806576 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98d3eede-8852-4bf5-a905-25974e47445f-proxy-tls\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.806600 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.806739 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-key\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.806785 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-images\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.807240 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-images\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.808160 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-cabundle\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.811279 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-apiservice-cert\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.811834 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-key\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.811903 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.812641 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.812960 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98d3eede-8852-4bf5-a905-25974e47445f-proxy-tls\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.819851 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77s24\" (UniqueName: \"kubernetes.io/projected/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-kube-api-access-77s24\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.834141 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrg6p\" (UniqueName: \"kubernetes.io/projected/a24423db-53f2-4555-81e4-228b3911e144-kube-api-access-xrg6p\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.842794 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-webhook-cert\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.849292 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnbwb\" (UniqueName: \"kubernetes.io/projected/c7332c18-9748-49d2-b512-a46c2d1fcb79-kube-api-access-tnbwb\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.855922 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.873660 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.875243 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.896047 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.916259 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.935525 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.968360 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.983330 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.984960 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f8zb\" (UniqueName: \"kubernetes.io/projected/b9dadb6a-e49e-4473-8338-3af567aacb4a-kube-api-access-9f8zb\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"
Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.996954 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.036917 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.037180 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.038126 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjdbq\" (UniqueName: \"kubernetes.io/projected/76705148-274c-4428-9508-13fe1193646e-kube-api-access-xjdbq\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.057359 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.072984 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.087923 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.089006 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.107489 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkqct\" (UniqueName: \"kubernetes.io/projected/44e0d81c-a6e7-4e95-9901-ea32b8476755-kube-api-access-dkqct\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.127326 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58vdh\" (UniqueName: \"kubernetes.io/projected/878b625f-d8df-457f-b208-f4bf5807a8d8-kube-api-access-58vdh\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.132241 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzxgj\" (UniqueName: \"kubernetes.io/projected/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-kube-api-access-pzxgj\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.154873 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9gbc\" (UniqueName: \"kubernetes.io/projected/ac4a870d-8cda-423b-a15b-391830c944f4-kube-api-access-k9gbc\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.174648 4781 request.go:700] Waited for 1.886102816s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.174880 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ktjdc"]
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.177799 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw4cs\" (UniqueName: \"kubernetes.io/projected/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-kube-api-access-mw4cs\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.192228 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/878b625f-d8df-457f-b208-f4bf5807a8d8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752"
Feb 27 00:09:04 crc kubenswrapper[4781]: W0227 00:09:04.193410 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7332c18_9748_49d2_b512_a46c2d1fcb79.slice/crio-569469b156a3d6f73fda1c00c629b8cfcf29a4662b4eccaa3dcb213bb4a0f1d1 WatchSource:0}: Error finding container 569469b156a3d6f73fda1c00c629b8cfcf29a4662b4eccaa3dcb213bb4a0f1d1: Status 404 returned error can't find the container with id 569469b156a3d6f73fda1c00c629b8cfcf29a4662b4eccaa3dcb213bb4a0f1d1
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.207747 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.209237 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vtsxv"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.210210 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2sq5\" (UniqueName: \"kubernetes.io/projected/13b9671c-f825-49de-913c-42e8d161f7f8-kube-api-access-r2sq5\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.227918 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.231271 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz5ss\" (UniqueName: \"kubernetes.io/projected/14579b3e-131e-4e98-b060-a93d2581479c-kube-api-access-cz5ss\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.249273 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7llv\" (UniqueName: \"kubernetes.io/projected/ae09caff-6233-41f8-bb7d-a2314363e2fa-kube-api-access-m7llv\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.267875 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.273502 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pgmz\" (UniqueName: \"kubernetes.io/projected/010c6a41-8e2d-4391-ac1b-82814dad98a4-kube-api-access-9pgmz\") pod \"control-plane-machine-set-operator-78cbb6b69f-8d9mv\" (UID: \"010c6a41-8e2d-4391-ac1b-82814dad98a4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.293563 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cr2bb"]
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.293811 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djzfc\" (UniqueName: \"kubernetes.io/projected/3f3571fd-ce1b-4105-9100-020fd1cd5076-kube-api-access-djzfc\") pod \"cluster-samples-operator-665b6dd947-tnl79\" (UID: \"3f3571fd-ce1b-4105-9100-020fd1cd5076\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.303325 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.308043 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac4a870d-8cda-423b-a15b-391830c944f4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.330139 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"]
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.332117 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.349774 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq5bh\" (UniqueName: \"kubernetes.io/projected/98d3eede-8852-4bf5-a905-25974e47445f-kube-api-access-pq5bh\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.367937 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.374751 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnwlg\" (UniqueName: \"kubernetes.io/projected/6497cf4e-c461-4db9-88e4-5de2a5f28404-kube-api-access-qnwlg\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.395658 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66pts\" (UniqueName: \"kubernetes.io/projected/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-kube-api-access-66pts\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.399678 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.413152 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-f4jxd"]
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.416139 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.416168 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.430660 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"]
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.435866 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.458298 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.475370 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.481258 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.495068 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.495819 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.502321 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv"
Feb 27 00:09:04 crc kubenswrapper[4781]: W0227 00:09:04.508231 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda24423db_53f2_4555_81e4_228b3911e144.slice/crio-e2d52d381f5f2c2aa6e3b3529d449b9cd90d4bab2b2b6374496041f06b7f95d6 WatchSource:0}: Error finding container e2d52d381f5f2c2aa6e3b3529d449b9cd90d4bab2b2b6374496041f06b7f95d6: Status 404 returned error can't find the container with id e2d52d381f5f2c2aa6e3b3529d449b9cd90d4bab2b2b6374496041f06b7f95d6
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.515114 4781 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.537475 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.539429 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.550155 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.553304 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.557374 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.560574 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.577860 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.596119 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.623533 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.636580 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.644803 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2zw27"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.658392 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kxcrw"]
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.658403 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.678893 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.691590 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds"]
Feb 27 00:09:04 crc kubenswrapper[4781]: W0227 00:09:04.707501 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44e0d81c_a6e7_4e95_9901_ea32b8476755.slice/crio-a862c59333a3fae30dcb2d5fe0c1a79ccdc0f501b86262bc96c6db90988cbb9d WatchSource:0}: Error finding container a862c59333a3fae30dcb2d5fe0c1a79ccdc0f501b86262bc96c6db90988cbb9d: Status 404 returned error can't find the container with id a862c59333a3fae30dcb2d5fe0c1a79ccdc0f501b86262bc96c6db90988cbb9d
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.714063 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-29z97"]
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.727408 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wgpv7"]
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.747937 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748212 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748232 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-856kl\" (UniqueName: \"kubernetes.io/projected/91e2c481-01ee-461f-bc5b-d09b7ea221c5-kube-api-access-856kl\") pod \"image-pruner-29535840-t9tlz\" (UID: \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\") " pod="openshift-image-registry/image-pruner-29535840-t9tlz"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748249 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwd7v\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-kube-api-access-fwd7v\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748265 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748280 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748295 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx57q\" (UniqueName: \"kubernetes.io/projected/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-kube-api-access-hx57q\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: \"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748311 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: \"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748326 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-default-certificate\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748342 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4ncb\" (UniqueName: \"kubernetes.io/projected/d5b604c3-aa52-42f3-8922-8edee056f016-kube-api-access-g4ncb\") pod \"dns-operator-744455d44c-d9gmh\" (UID: \"d5b604c3-aa52-42f3-8922-8edee056f016\") " pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748358 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748384 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748402 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsc87\" (UniqueName: \"kubernetes.io/projected/7822bd5e-93d1-4f1e-961c-ec0c8a04ab59-kube-api-access-gsc87\") pod \"multus-admission-controller-857f4d67dd-6rw4v\" (UID: \"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748428 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-registry-certificates\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748446 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748462 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k94m7\" (UniqueName:
\"kubernetes.io/projected/6846d54c-4d22-46c7-b017-947a3986d773-kube-api-access-k94m7\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748499 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-config\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748517 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czjn5\" (UniqueName: \"kubernetes.io/projected/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-kube-api-access-czjn5\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: \"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748540 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748554 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/678f27fc-d210-4a4f-bd73-090378740da9-config-volume\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748573 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfnpb\" (UniqueName: \"kubernetes.io/projected/a75bfacf-8cf7-4560-8b4a-6e876daa4c8c-kube-api-access-tfnpb\") pod \"downloads-7954f5f757-qjwrj\" (UID: \"a75bfacf-8cf7-4560-8b4a-6e876daa4c8c\") " pod="openshift-console/downloads-7954f5f757-qjwrj" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748590 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-registry-tls\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748612 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748660 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748677 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-kr2kq\" (UniqueName: \"kubernetes.io/projected/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-kube-api-access-kr2kq\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748691 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-srv-cert\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: \"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748707 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-dir\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748723 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-service-ca-bundle\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748749 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf32f77b-92ad-479d-8ee3-423f16089eb6-serving-cert\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748784 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d51c244-aac1-41de-adc4-2393a45392f1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748799 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/678f27fc-d210-4a4f-bd73-090378740da9-secret-volume\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748815 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748833 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnhbf\" (UniqueName: \"kubernetes.io/projected/14db9d97-7da5-43c2-8d48-fb435f1a19d0-kube-api-access-vnhbf\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748865 
4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e1c9b213-8c36-4ecf-831f-69a912f6364f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748888 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jldnh\" (UniqueName: \"kubernetes.io/projected/5cbee45f-1bdf-44e9-9782-83340ea69870-kube-api-access-jldnh\") pod \"package-server-manager-789f6589d5-8mth6\" (UID: \"5cbee45f-1bdf-44e9-9782-83340ea69870\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748910 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/14db9d97-7da5-43c2-8d48-fb435f1a19d0-proxy-tls\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748927 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cbee45f-1bdf-44e9-9782-83340ea69870-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8mth6\" (UID: \"5cbee45f-1bdf-44e9-9782-83340ea69870\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748943 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-profile-collector-cert\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: \"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748959 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748976 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1c9b213-8c36-4ecf-831f-69a912f6364f-serving-cert\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748992 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7822bd5e-93d1-4f1e-961c-ec0c8a04ab59-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6rw4v\" (UID: \"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749019 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsq8c\" (UniqueName: \"kubernetes.io/projected/e1c9b213-8c36-4ecf-831f-69a912f6364f-kube-api-access-hsq8c\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: 
\"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749044 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d51c244-aac1-41de-adc4-2393a45392f1-config\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749059 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/91e2c481-01ee-461f-bc5b-d09b7ea221c5-serviceca\") pod \"image-pruner-29535840-t9tlz\" (UID: \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\") " pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749076 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-bound-sa-token\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749091 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-stats-auth\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749114 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749130 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-policies\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749154 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14db9d97-7da5-43c2-8d48-fb435f1a19d0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749169 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9gg9\" (UniqueName: \"kubernetes.io/projected/396c6e41-89e8-4ecf-ac96-f73aad1f4bbb-kube-api-access-l9gg9\") pod \"migrator-59844c95c7-2xvkz\" (UID: \"396c6e41-89e8-4ecf-ac96-f73aad1f4bbb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749186 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-trusted-ca\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749202 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95hgc\" (UniqueName: \"kubernetes.io/projected/678f27fc-d210-4a4f-bd73-090378740da9-kube-api-access-95hgc\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749219 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749233 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-config\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749248 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-metrics-certs\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749273 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6846d54c-4d22-46c7-b017-947a3986d773-service-ca-bundle\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749298 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: \"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749319 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d5b604c3-aa52-42f3-8922-8edee056f016-metrics-tls\") pod \"dns-operator-744455d44c-d9gmh\" (UID: \"d5b604c3-aa52-42f3-8922-8edee056f016\") " pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749335 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pqfm\" (UniqueName: \"kubernetes.io/projected/bf32f77b-92ad-479d-8ee3-423f16089eb6-kube-api-access-9pqfm\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749349 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/16339491-baee-42b5-82bb-07bca82a5f77-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749376 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749392 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d51c244-aac1-41de-adc4-2393a45392f1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749408 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/16339491-baee-42b5-82bb-07bca82a5f77-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: E0227 00:09:04.752329 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:05.252314125 +0000 UTC m=+214.509853669 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.775585 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp"] Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.781054 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vtsxv"] Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.782022 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752"] Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.783147 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" event={"ID":"14579b3e-131e-4e98-b060-a93d2581479c","Type":"ContainerStarted","Data":"cb59e805734117202d19664eb43966ce8e0467aee64b770dd770da346fa9a444"} Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.784303 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" event={"ID":"3ba2e306-8f79-4e15-8529-f3a16a0fa95f","Type":"ContainerStarted","Data":"1c300badcad0a7b1d1f35986cff4462b758b04b4b3586b86436a5dec1d1bdfe1"} Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.785465 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" 
event={"ID":"c7332c18-9748-49d2-b512-a46c2d1fcb79","Type":"ContainerStarted","Data":"7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710"} Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.785500 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" event={"ID":"c7332c18-9748-49d2-b512-a46c2d1fcb79","Type":"ContainerStarted","Data":"569469b156a3d6f73fda1c00c629b8cfcf29a4662b4eccaa3dcb213bb4a0f1d1"} Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.785738 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.788216 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" event={"ID":"a24423db-53f2-4555-81e4-228b3911e144","Type":"ContainerStarted","Data":"e2d52d381f5f2c2aa6e3b3529d449b9cd90d4bab2b2b6374496041f06b7f95d6"} Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.789110 4781 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ktjdc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.789138 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" podUID="c7332c18-9748-49d2-b512-a46c2d1fcb79" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.792582 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" 
event={"ID":"b9dadb6a-e49e-4473-8338-3af567aacb4a","Type":"ContainerStarted","Data":"e9d5a1980724d143f2a7fb6b4bfe55b32f38b196ca38766a71fb45630ec5a5f0"} Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.795395 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" event={"ID":"cb172836-9833-43d5-a99b-cc01b3dd6694","Type":"ContainerStarted","Data":"012340f634683fb6a06950de9235a76463299205e92e22dd7302613228455891"} Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.795417 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" event={"ID":"cb172836-9833-43d5-a99b-cc01b3dd6694","Type":"ContainerStarted","Data":"154a034e255de1d0282d96277f75f0f98ed53321238e0ad1ca74e3a78b581c32"} Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.796470 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" event={"ID":"44e0d81c-a6e7-4e95-9901-ea32b8476755","Type":"ContainerStarted","Data":"a862c59333a3fae30dcb2d5fe0c1a79ccdc0f501b86262bc96c6db90988cbb9d"} Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.799236 4781 generic.go:334] "Generic (PLEG): container finished" podID="d9ce11ed-3022-47e0-8150-8af94af65076" containerID="af1ff5a8ff8c84fc8b96f4c32a6a04d56e630d32b8fbaa75d297c098223eb3db" exitCode=0 Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.799280 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" event={"ID":"d9ce11ed-3022-47e0-8150-8af94af65076","Type":"ContainerDied","Data":"af1ff5a8ff8c84fc8b96f4c32a6a04d56e630d32b8fbaa75d297c098223eb3db"} Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.799306 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" 
event={"ID":"d9ce11ed-3022-47e0-8150-8af94af65076","Type":"ContainerStarted","Data":"2ecfb860b15c367b713db5be275357775650915a6aaa9a58138869228dd57b2c"} Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.846799 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv"] Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850500 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:04 crc kubenswrapper[4781]: E0227 00:09:04.850697 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:05.350671147 +0000 UTC m=+214.608210701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850734 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-config\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850760 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czjn5\" (UniqueName: \"kubernetes.io/projected/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-kube-api-access-czjn5\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: \"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850800 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-plugins-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850818 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-kube-api-access\") pod 
\"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850871 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/678f27fc-d210-4a4f-bd73-090378740da9-config-volume\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850888 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-socket-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850924 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfnpb\" (UniqueName: \"kubernetes.io/projected/a75bfacf-8cf7-4560-8b4a-6e876daa4c8c-kube-api-access-tfnpb\") pod \"downloads-7954f5f757-qjwrj\" (UID: \"a75bfacf-8cf7-4560-8b4a-6e876daa4c8c\") " pod="openshift-console/downloads-7954f5f757-qjwrj" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850943 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16f4a859-d834-408d-9a9c-4d293b47d95a-config\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850959 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-registry-tls\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850979 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851029 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2r6k\" (UniqueName: \"kubernetes.io/projected/16f4a859-d834-408d-9a9c-4d293b47d95a-kube-api-access-f2r6k\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851064 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc7gw\" (UniqueName: \"kubernetes.io/projected/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-kube-api-access-dc7gw\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851079 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16f4a859-d834-408d-9a9c-4d293b47d95a-serving-cert\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 
27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851136 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851153 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-registration-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851168 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr2kq\" (UniqueName: \"kubernetes.io/projected/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-kube-api-access-kr2kq\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851185 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-srv-cert\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: \"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851211 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-dir\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: 
\"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851228 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/38d33d08-97ce-49cb-b200-8ee30fc09e77-certs\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851251 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-service-ca-bundle\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851266 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k8gs\" (UniqueName: \"kubernetes.io/projected/38d33d08-97ce-49cb-b200-8ee30fc09e77-kube-api-access-4k8gs\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851291 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf32f77b-92ad-479d-8ee3-423f16089eb6-serving-cert\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851308 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/9d51c244-aac1-41de-adc4-2393a45392f1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851327 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/678f27fc-d210-4a4f-bd73-090378740da9-secret-volume\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851358 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851379 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnhbf\" (UniqueName: \"kubernetes.io/projected/14db9d97-7da5-43c2-8d48-fb435f1a19d0-kube-api-access-vnhbf\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851396 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-mountpoint-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " 
pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851419 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e1c9b213-8c36-4ecf-831f-69a912f6364f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851435 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jldnh\" (UniqueName: \"kubernetes.io/projected/5cbee45f-1bdf-44e9-9782-83340ea69870-kube-api-access-jldnh\") pod \"package-server-manager-789f6589d5-8mth6\" (UID: \"5cbee45f-1bdf-44e9-9782-83340ea69870\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851475 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/14db9d97-7da5-43c2-8d48-fb435f1a19d0-proxy-tls\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851510 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cbee45f-1bdf-44e9-9782-83340ea69870-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8mth6\" (UID: \"5cbee45f-1bdf-44e9-9782-83340ea69870\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851528 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-profile-collector-cert\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: \"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851536 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-config\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851555 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851590 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7822bd5e-93d1-4f1e-961c-ec0c8a04ab59-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6rw4v\" (UID: \"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851611 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1c9b213-8c36-4ecf-831f-69a912f6364f-serving-cert\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852141 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852678 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-config-volume\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852713 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsq8c\" (UniqueName: \"kubernetes.io/projected/e1c9b213-8c36-4ecf-831f-69a912f6364f-kube-api-access-hsq8c\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852758 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d51c244-aac1-41de-adc4-2393a45392f1-config\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852777 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/91e2c481-01ee-461f-bc5b-d09b7ea221c5-serviceca\") pod \"image-pruner-29535840-t9tlz\" (UID: \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\") " pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852814 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-bound-sa-token\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852828 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-stats-auth\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852846 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852915 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-policies\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852940 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-trusted-ca\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852959 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14db9d97-7da5-43c2-8d48-fb435f1a19d0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852976 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9gg9\" (UniqueName: \"kubernetes.io/projected/396c6e41-89e8-4ecf-ac96-f73aad1f4bbb-kube-api-access-l9gg9\") pod \"migrator-59844c95c7-2xvkz\" (UID: \"396c6e41-89e8-4ecf-ac96-f73aad1f4bbb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852992 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853009 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95hgc\" (UniqueName: \"kubernetes.io/projected/678f27fc-d210-4a4f-bd73-090378740da9-kube-api-access-95hgc\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853037 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-config\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853063 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-metrics-certs\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853093 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6846d54c-4d22-46c7-b017-947a3986d773-service-ca-bundle\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853111 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-metrics-tls\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853154 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: 
\"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853173 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/38d33d08-97ce-49cb-b200-8ee30fc09e77-node-bootstrap-token\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853211 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d5b604c3-aa52-42f3-8922-8edee056f016-metrics-tls\") pod \"dns-operator-744455d44c-d9gmh\" (UID: \"d5b604c3-aa52-42f3-8922-8edee056f016\") " pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853248 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pqfm\" (UniqueName: \"kubernetes.io/projected/bf32f77b-92ad-479d-8ee3-423f16089eb6-kube-api-access-9pqfm\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853272 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-csi-data-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853298 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/16339491-baee-42b5-82bb-07bca82a5f77-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853470 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv9hp\" (UniqueName: \"kubernetes.io/projected/df035290-8e3c-422b-90ac-573b592defcf-kube-api-access-mv9hp\") pod \"auto-csr-approver-29535848-ccctv\" (UID: \"df035290-8e3c-422b-90ac-573b592defcf\") " pod="openshift-infra/auto-csr-approver-29535848-ccctv" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853518 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59cxk\" (UniqueName: \"kubernetes.io/projected/bd798400-ea88-4aad-ae19-815b6b8d57da-kube-api-access-59cxk\") pod \"ingress-canary-574r8\" (UID: \"bd798400-ea88-4aad-ae19-815b6b8d57da\") " pod="openshift-ingress-canary/ingress-canary-574r8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853575 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853594 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d51c244-aac1-41de-adc4-2393a45392f1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853611 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsp7c\" (UniqueName: \"kubernetes.io/projected/26e75b38-be64-4f34-933f-731abfe217b6-kube-api-access-zsp7c\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853644 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd798400-ea88-4aad-ae19-815b6b8d57da-cert\") pod \"ingress-canary-574r8\" (UID: \"bd798400-ea88-4aad-ae19-815b6b8d57da\") " pod="openshift-ingress-canary/ingress-canary-574r8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853681 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/16339491-baee-42b5-82bb-07bca82a5f77-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853720 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853811 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwd7v\" (UniqueName: 
\"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-kube-api-access-fwd7v\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853832 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853850 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853867 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-856kl\" (UniqueName: \"kubernetes.io/projected/91e2c481-01ee-461f-bc5b-d09b7ea221c5-kube-api-access-856kl\") pod \"image-pruner-29535840-t9tlz\" (UID: \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\") " pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853895 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853914 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx57q\" (UniqueName: \"kubernetes.io/projected/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-kube-api-access-hx57q\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: \"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853934 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: \"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853953 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-default-certificate\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.854000 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4ncb\" (UniqueName: \"kubernetes.io/projected/d5b604c3-aa52-42f3-8922-8edee056f016-kube-api-access-g4ncb\") pod \"dns-operator-744455d44c-d9gmh\" (UID: \"d5b604c3-aa52-42f3-8922-8edee056f016\") " pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.854020 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsc87\" (UniqueName: 
\"kubernetes.io/projected/7822bd5e-93d1-4f1e-961c-ec0c8a04ab59-kube-api-access-gsc87\") pod \"multus-admission-controller-857f4d67dd-6rw4v\" (UID: \"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.854036 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.854052 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.854071 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-registry-certificates\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.854087 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 
27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.854107 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k94m7\" (UniqueName: \"kubernetes.io/projected/6846d54c-4d22-46c7-b017-947a3986d773-kube-api-access-k94m7\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: E0227 00:09:04.855374 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:05.355358274 +0000 UTC m=+214.612897828 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.857250 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e1c9b213-8c36-4ecf-831f-69a912f6364f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.857798 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-registry-tls\") pod \"image-registry-697d97f7c8-tw95c\" (UID: 
\"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.858842 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-service-ca-bundle\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.861045 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-dir\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.863411 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-policies\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.864544 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14db9d97-7da5-43c2-8d48-fb435f1a19d0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853092 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/678f27fc-d210-4a4f-bd73-090378740da9-config-volume\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.867736 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d51c244-aac1-41de-adc4-2393a45392f1-config\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.868038 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/91e2c481-01ee-461f-bc5b-d09b7ea221c5-serviceca\") pod \"image-pruner-29535840-t9tlz\" (UID: \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\") " pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.868456 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/14db9d97-7da5-43c2-8d48-fb435f1a19d0-proxy-tls\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.868963 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-config\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.871222 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.871878 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7822bd5e-93d1-4f1e-961c-ec0c8a04ab59-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6rw4v\" (UID: \"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.872021 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6846d54c-4d22-46c7-b017-947a3986d773-service-ca-bundle\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.872189 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-stats-auth\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.872385 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf32f77b-92ad-479d-8ee3-423f16089eb6-serving-cert\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: 
I0227 00:09:04.872401 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-trusted-ca\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.875542 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.875936 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cbee45f-1bdf-44e9-9782-83340ea69870-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8mth6\" (UID: \"5cbee45f-1bdf-44e9-9782-83340ea69870\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.876538 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-registry-certificates\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.878910 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc"] Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.879407 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: \"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.880536 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.882030 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/16339491-baee-42b5-82bb-07bca82a5f77-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.888676 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-profile-collector-cert\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: \"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.888756 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/678f27fc-d210-4a4f-bd73-090378740da9-secret-volume\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:04 crc 
kubenswrapper[4781]: I0227 00:09:04.893501 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: \"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.893539 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.893776 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.893839 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.894167 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-srv-cert\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: 
\"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.894177 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1c9b213-8c36-4ecf-831f-69a912f6364f-serving-cert\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.894248 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.894356 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d5b604c3-aa52-42f3-8922-8edee056f016-metrics-tls\") pod \"dns-operator-744455d44c-d9gmh\" (UID: \"d5b604c3-aa52-42f3-8922-8edee056f016\") " pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.894533 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.895131 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czjn5\" (UniqueName: 
\"kubernetes.io/projected/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-kube-api-access-czjn5\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: \"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.895139 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.896235 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.896466 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.896749 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.896807 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.897263 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d51c244-aac1-41de-adc4-2393a45392f1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.910035 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfnpb\" (UniqueName: \"kubernetes.io/projected/a75bfacf-8cf7-4560-8b4a-6e876daa4c8c-kube-api-access-tfnpb\") pod \"downloads-7954f5f757-qjwrj\" (UID: \"a75bfacf-8cf7-4560-8b4a-6e876daa4c8c\") " pod="openshift-console/downloads-7954f5f757-qjwrj" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.927837 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/16339491-baee-42b5-82bb-07bca82a5f77-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.929972 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-metrics-certs\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.933918 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-default-certificate\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.953238 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qjwrj" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955033 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955142 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-metrics-tls\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955162 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/38d33d08-97ce-49cb-b200-8ee30fc09e77-node-bootstrap-token\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 
00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955192 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-csi-data-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955210 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv9hp\" (UniqueName: \"kubernetes.io/projected/df035290-8e3c-422b-90ac-573b592defcf-kube-api-access-mv9hp\") pod \"auto-csr-approver-29535848-ccctv\" (UID: \"df035290-8e3c-422b-90ac-573b592defcf\") " pod="openshift-infra/auto-csr-approver-29535848-ccctv" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955459 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59cxk\" (UniqueName: \"kubernetes.io/projected/bd798400-ea88-4aad-ae19-815b6b8d57da-kube-api-access-59cxk\") pod \"ingress-canary-574r8\" (UID: \"bd798400-ea88-4aad-ae19-815b6b8d57da\") " pod="openshift-ingress-canary/ingress-canary-574r8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955477 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsp7c\" (UniqueName: \"kubernetes.io/projected/26e75b38-be64-4f34-933f-731abfe217b6-kube-api-access-zsp7c\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955492 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd798400-ea88-4aad-ae19-815b6b8d57da-cert\") pod \"ingress-canary-574r8\" (UID: \"bd798400-ea88-4aad-ae19-815b6b8d57da\") " pod="openshift-ingress-canary/ingress-canary-574r8" Feb 27 00:09:04 crc 
kubenswrapper[4781]: I0227 00:09:04.955559 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-plugins-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955576 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-socket-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955591 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16f4a859-d834-408d-9a9c-4d293b47d95a-config\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955617 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2r6k\" (UniqueName: \"kubernetes.io/projected/16f4a859-d834-408d-9a9c-4d293b47d95a-kube-api-access-f2r6k\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955740 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16f4a859-d834-408d-9a9c-4d293b47d95a-serving-cert\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955792 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc7gw\" (UniqueName: \"kubernetes.io/projected/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-kube-api-access-dc7gw\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955805 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-csi-data-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955829 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-registration-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955853 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/38d33d08-97ce-49cb-b200-8ee30fc09e77-certs\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955869 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k8gs\" (UniqueName: \"kubernetes.io/projected/38d33d08-97ce-49cb-b200-8ee30fc09e77-kube-api-access-4k8gs\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " 
pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:04 crc kubenswrapper[4781]: E0227 00:09:04.955915 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:05.455898276 +0000 UTC m=+214.713437830 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955961 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-mountpoint-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.956010 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-config-volume\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.956595 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-config-volume\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " 
pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.956781 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-mountpoint-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.956918 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-plugins-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.957064 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-socket-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.957419 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-registration-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.957995 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-metrics-tls\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.958149 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jldnh\" (UniqueName: \"kubernetes.io/projected/5cbee45f-1bdf-44e9-9782-83340ea69870-kube-api-access-jldnh\") pod \"package-server-manager-789f6589d5-8mth6\" (UID: \"5cbee45f-1bdf-44e9-9782-83340ea69870\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.967544 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd798400-ea88-4aad-ae19-815b6b8d57da-cert\") pod \"ingress-canary-574r8\" (UID: \"bd798400-ea88-4aad-ae19-815b6b8d57da\") " pod="openshift-ingress-canary/ingress-canary-574r8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.972190 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsq8c\" (UniqueName: \"kubernetes.io/projected/e1c9b213-8c36-4ecf-831f-69a912f6364f-kube-api-access-hsq8c\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.978553 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d"] Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.989019 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d51c244-aac1-41de-adc4-2393a45392f1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.008512 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.009467 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-bound-sa-token\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.026985 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16f4a859-d834-408d-9a9c-4d293b47d95a-config\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.027039 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16f4a859-d834-408d-9a9c-4d293b47d95a-serving-cert\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.027496 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.031765 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/38d33d08-97ce-49cb-b200-8ee30fc09e77-certs\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.031950 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9gg9\" (UniqueName: \"kubernetes.io/projected/396c6e41-89e8-4ecf-ac96-f73aad1f4bbb-kube-api-access-l9gg9\") pod \"migrator-59844c95c7-2xvkz\" (UID: \"396c6e41-89e8-4ecf-ac96-f73aad1f4bbb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.035320 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/38d33d08-97ce-49cb-b200-8ee30fc09e77-node-bootstrap-token\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.050951 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnhbf\" (UniqueName: \"kubernetes.io/projected/14db9d97-7da5-43c2-8d48-fb435f1a19d0-kube-api-access-vnhbf\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.058112 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.058514 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:05.558500695 +0000 UTC m=+214.816040249 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.087694 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.090260 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr2kq\" (UniqueName: \"kubernetes.io/projected/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-kube-api-access-kr2kq\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.112967 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pqfm\" (UniqueName: \"kubernetes.io/projected/bf32f77b-92ad-479d-8ee3-423f16089eb6-kube-api-access-9pqfm\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.114914 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.128507 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwd7v\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-kube-api-access-fwd7v\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.131270 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.150243 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4ncb\" (UniqueName: \"kubernetes.io/projected/d5b604c3-aa52-42f3-8922-8edee056f016-kube-api-access-g4ncb\") pod \"dns-operator-744455d44c-d9gmh\" (UID: \"d5b604c3-aa52-42f3-8922-8edee056f016\") " pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.158750 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95hgc\" (UniqueName: \"kubernetes.io/projected/678f27fc-d210-4a4f-bd73-090378740da9-kube-api-access-95hgc\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.158981 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.159302 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:05.659289193 +0000 UTC m=+214.916828747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.167281 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.175917 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsc87\" (UniqueName: \"kubernetes.io/projected/7822bd5e-93d1-4f1e-961c-ec0c8a04ab59-kube-api-access-gsc87\") pod \"multus-admission-controller-857f4d67dd-6rw4v\" (UID: \"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.181742 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.189833 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-856kl\" (UniqueName: \"kubernetes.io/projected/91e2c481-01ee-461f-bc5b-d09b7ea221c5-kube-api-access-856kl\") pod \"image-pruner-29535840-t9tlz\" (UID: \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\") " pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.209438 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.210494 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.213241 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k94m7\" (UniqueName: \"kubernetes.io/projected/6846d54c-4d22-46c7-b017-947a3986d773-kube-api-access-k94m7\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.230255 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.231231 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx57q\" (UniqueName: \"kubernetes.io/projected/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-kube-api-access-hx57q\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: \"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:05 crc kubenswrapper[4781]: W0227 
00:09:05.248016 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae09caff_6233_41f8_bb7d_a2314363e2fa.slice/crio-326df085f5c1d99acbb7fcfd76f06d7ee211aa1d505e95c60c62d639a1aeac5b WatchSource:0}: Error finding container 326df085f5c1d99acbb7fcfd76f06d7ee211aa1d505e95c60c62d639a1aeac5b: Status 404 returned error can't find the container with id 326df085f5c1d99acbb7fcfd76f06d7ee211aa1d505e95c60c62d639a1aeac5b Feb 27 00:09:05 crc kubenswrapper[4781]: W0227 00:09:05.249185 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98d3eede_8852_4bf5_a905_25974e47445f.slice/crio-0c726a221019f7d1c3295ad3f7aa546ff2c9eec6b98aeb53101e79567375dc69 WatchSource:0}: Error finding container 0c726a221019f7d1c3295ad3f7aa546ff2c9eec6b98aeb53101e79567375dc69: Status 404 returned error can't find the container with id 0c726a221019f7d1c3295ad3f7aa546ff2c9eec6b98aeb53101e79567375dc69 Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.257496 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.259907 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.260324 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-27 00:09:05.760312085 +0000 UTC m=+215.017851639 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.271563 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsp7c\" (UniqueName: \"kubernetes.io/projected/26e75b38-be64-4f34-933f-731abfe217b6-kube-api-access-zsp7c\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.274389 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.282349 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.291746 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.298212 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2zw27"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.302443 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.315082 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.334114 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k8gs\" (UniqueName: \"kubernetes.io/projected/38d33d08-97ce-49cb-b200-8ee30fc09e77-kube-api-access-4k8gs\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.354170 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2r6k\" (UniqueName: \"kubernetes.io/projected/16f4a859-d834-408d-9a9c-4d293b47d95a-kube-api-access-f2r6k\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.355310 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv9hp\" (UniqueName: \"kubernetes.io/projected/df035290-8e3c-422b-90ac-573b592defcf-kube-api-access-mv9hp\") pod \"auto-csr-approver-29535848-ccctv\" (UID: \"df035290-8e3c-422b-90ac-573b592defcf\") " pod="openshift-infra/auto-csr-approver-29535848-ccctv" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.362935 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.363507 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59cxk\" (UniqueName: \"kubernetes.io/projected/bd798400-ea88-4aad-ae19-815b6b8d57da-kube-api-access-59cxk\") pod \"ingress-canary-574r8\" (UID: \"bd798400-ea88-4aad-ae19-815b6b8d57da\") " pod="openshift-ingress-canary/ingress-canary-574r8" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.363938 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.364367 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:05.864351447 +0000 UTC m=+215.121891001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.369729 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc7gw\" (UniqueName: \"kubernetes.io/projected/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-kube-api-access-dc7gw\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.371652 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.448882 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.465799 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.466409 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-27 00:09:05.966396094 +0000 UTC m=+215.223935648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.478940 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qjwrj"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.514726 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.544780 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535848-ccctv" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.548306 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.556942 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.566404 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.567076 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.567862 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.067834656 +0000 UTC m=+215.325374210 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.577788 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-574r8" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.586587 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.596952 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.663497 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.669210 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.669664 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.169652247 +0000 UTC m=+215.427191801 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.745643 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.774480 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.774649 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.27460828 +0000 UTC m=+215.532147834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.775072 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.775426 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.275415269 +0000 UTC m=+215.532954823 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: W0227 00:09:05.796887 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b5292e2_0434_46c6_ba9e_33622d4d5cbf.slice/crio-c0acf06a20c749cc1ae544f4c829253a7f8c4162d5a1e577ded11330345f90c7 WatchSource:0}: Error finding container c0acf06a20c749cc1ae544f4c829253a7f8c4162d5a1e577ded11330345f90c7: Status 404 returned error can't find the container with id c0acf06a20c749cc1ae544f4c829253a7f8c4162d5a1e577ded11330345f90c7 Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.810554 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" podStartSLOduration=159.810533676 podStartE2EDuration="2m39.810533676s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:05.806382231 +0000 UTC m=+215.063921805" watchObservedRunningTime="2026-02-27 00:09:05.810533676 +0000 UTC m=+215.068073230" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.816978 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.851837 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" 
event={"ID":"a24423db-53f2-4555-81e4-228b3911e144","Type":"ContainerStarted","Data":"1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.851902 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.857891 4781 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-swgz7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.857947 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" podUID="a24423db-53f2-4555-81e4-228b3911e144" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.863037 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" event={"ID":"5b5292e2-0434-46c6-ba9e-33622d4d5cbf","Type":"ContainerStarted","Data":"c0acf06a20c749cc1ae544f4c829253a7f8c4162d5a1e577ded11330345f90c7"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.867742 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" event={"ID":"878b625f-d8df-457f-b208-f4bf5807a8d8","Type":"ContainerStarted","Data":"49b4a5b9ea767c3f3ded04253c9c548546297c1f4f105cc5c68d051e2afca9c6"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.867787 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" event={"ID":"878b625f-d8df-457f-b208-f4bf5807a8d8","Type":"ContainerStarted","Data":"4cdcfb6cdc8e565b7f9e1f0ea51788665f8bd011272aeedcaa6a097c9b7c5026"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.874499 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vtsxv" event={"ID":"76705148-274c-4428-9508-13fe1193646e","Type":"ContainerStarted","Data":"ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.874579 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vtsxv" event={"ID":"76705148-274c-4428-9508-13fe1193646e","Type":"ContainerStarted","Data":"52e8848cb853a0dc3b72ab7abe99678676a1a3484d971d2212d9dc7e0814de5c"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.875583 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.875826 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.375803067 +0000 UTC m=+215.633342621 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.875955 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" event={"ID":"44e0d81c-a6e7-4e95-9901-ea32b8476755","Type":"ContainerStarted","Data":"90ebcf3c9c1ac55298b5363da09070f712031dbaf6a4c00e2fd3545333fb5567"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.876213 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.876586 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.376560614 +0000 UTC m=+215.634100168 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.877619 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" event={"ID":"ae09caff-6233-41f8-bb7d-a2314363e2fa","Type":"ContainerStarted","Data":"326df085f5c1d99acbb7fcfd76f06d7ee211aa1d505e95c60c62d639a1aeac5b"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.878542 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" event={"ID":"010c6a41-8e2d-4391-ac1b-82814dad98a4","Type":"ContainerStarted","Data":"594da34db4ae926dcdf468cf745b0657133ee14da1e7caa9917bf23a62076b90"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.888294 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" event={"ID":"9d51c244-aac1-41de-adc4-2393a45392f1","Type":"ContainerStarted","Data":"ab2a19c721b76207a01169cb15c2cd97931ca774c3f35e5701adb61f0b59b53f"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.889579 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" event={"ID":"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237","Type":"ContainerStarted","Data":"a61f4755bdafb602d65547200450b5fc07abed6b2954007880e48fa224d12563"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.891257 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" event={"ID":"13b9671c-f825-49de-913c-42e8d161f7f8","Type":"ContainerStarted","Data":"d2bb1c7740f71774234caf6902fa0129c33e5b3a02856f65a25645b7a7da84bc"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.891292 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" event={"ID":"13b9671c-f825-49de-913c-42e8d161f7f8","Type":"ContainerStarted","Data":"d0fbb5bdc6e055cb4679da36d5719843e846e3bcebb13bedf63f016e29a71e4d"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.893451 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qjwrj" event={"ID":"a75bfacf-8cf7-4560-8b4a-6e876daa4c8c","Type":"ContainerStarted","Data":"810d3cbf6a4bee635e4bba3ed460e3d254c4823916f364fc323d6d9564666098"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.895698 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" event={"ID":"396c6e41-89e8-4ecf-ac96-f73aad1f4bbb","Type":"ContainerStarted","Data":"ac56aa27a5bf69355adfe6792874ce1ed6ac0d76620dbe06be510d2e7aa2e337"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.897094 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" event={"ID":"d9ce11ed-3022-47e0-8150-8af94af65076","Type":"ContainerStarted","Data":"b9acad3269f03e3ff4c82c2d13789f039370f2693e20256519df2b53eb8f050e"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.898891 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" event={"ID":"cb172836-9833-43d5-a99b-cc01b3dd6694","Type":"ContainerStarted","Data":"7be9f81687da1d72f53f1f33763c2c529a0559acf3fd1b374e5534b715c61572"} Feb 27 00:09:05 crc 
kubenswrapper[4781]: I0227 00:09:05.900038 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" event={"ID":"6497cf4e-c461-4db9-88e4-5de2a5f28404","Type":"ContainerStarted","Data":"eba09c3bf14b2017bd020ade926809be8e232218721a795e83f9c8c829c4c279"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.901088 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" event={"ID":"ac4a870d-8cda-423b-a15b-391830c944f4","Type":"ContainerStarted","Data":"ba656d94347ab7a46b01e399e4427f40464e1f1733075cff10d80842ea3064c5"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.902136 4781 generic.go:334] "Generic (PLEG): container finished" podID="b9dadb6a-e49e-4473-8338-3af567aacb4a" containerID="fba843b98f1d94d65c7a9a00944c4a2c075200df9aef3220e5fad50a748b7e1c" exitCode=0 Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.902282 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" event={"ID":"b9dadb6a-e49e-4473-8338-3af567aacb4a","Type":"ContainerDied","Data":"fba843b98f1d94d65c7a9a00944c4a2c075200df9aef3220e5fad50a748b7e1c"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.903365 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" event={"ID":"3f3571fd-ce1b-4105-9100-020fd1cd5076","Type":"ContainerStarted","Data":"9e3065a5ebe5119956c8b7ec1c757a3a6b50906c540cfd667e1750e8d94fbea7"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.904359 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" event={"ID":"14579b3e-131e-4e98-b060-a93d2581479c","Type":"ContainerStarted","Data":"c4d7bf6f0d3b9885760bd255588d753ccdced280441e504b58e960ca2ae484bc"} Feb 27 00:09:05 crc 
kubenswrapper[4781]: I0227 00:09:05.905085 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" event={"ID":"98d3eede-8852-4bf5-a905-25974e47445f","Type":"ContainerStarted","Data":"0c726a221019f7d1c3295ad3f7aa546ff2c9eec6b98aeb53101e79567375dc69"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.905953 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" event={"ID":"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464","Type":"ContainerStarted","Data":"58dd5894fa816fda9c2863c52d5161febd70739593a31e4d6775f38563b29b0b"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.905975 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" event={"ID":"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464","Type":"ContainerStarted","Data":"f44d631dec8222980dc3c7da844439b422c7b0292e9e92d61e325d0c27b8fe0f"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.907286 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" event={"ID":"3ba2e306-8f79-4e15-8529-f3a16a0fa95f","Type":"ContainerStarted","Data":"a05bfa8d2e666094e1a4bda7adf713a5a9fa809cb67207952f2d86a464796379"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.908444 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" event={"ID":"6dc17f1d-c1f4-43b9-9291-7c32c6804d44","Type":"ContainerStarted","Data":"ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.908466 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" 
event={"ID":"6dc17f1d-c1f4-43b9-9291-7c32c6804d44","Type":"ContainerStarted","Data":"eb45173a1f629c7ad2883098f5964e4563b43bb7bdca30eb6fc3bc6e2ce93911"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.908976 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.909772 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2zw27" event={"ID":"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7","Type":"ContainerStarted","Data":"023ce4349c07e500b74b045fb2be36211c6c6c2639fc1ba445ec76055f8ab82c"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.910290 4781 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ktjdc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.910327 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" podUID="c7332c18-9748-49d2-b512-a46c2d1fcb79" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.916790 4781 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wgpv7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.916843 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" 
podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.977042 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.977223 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.477196298 +0000 UTC m=+215.734735862 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.977270 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.977609 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.477598938 +0000 UTC m=+215.735138512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.033051 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm"] Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.078325 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.079937 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.57991275 +0000 UTC m=+215.837452304 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.080517 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.084032 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.584012524 +0000 UTC m=+215.841552078 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.182475 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.183022 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.68298742 +0000 UTC m=+215.940526974 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.288215 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.288577 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.788563587 +0000 UTC m=+216.046103141 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.404696 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.404879 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.90485621 +0000 UTC m=+216.162395764 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.405279 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.405762 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.905738371 +0000 UTC m=+216.163277925 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.513600 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.515041 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.015021823 +0000 UTC m=+216.272561377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.611233 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fdkct"] Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.616680 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.617028 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.117015989 +0000 UTC m=+216.374555543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.666256 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d9gmh"] Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.711513 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6"] Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.717438 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.717802 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.217788636 +0000 UTC m=+216.475328190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.749164 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wdgtd"] Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.819842 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.820538 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.320521018 +0000 UTC m=+216.578060572 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.909823 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9z8qr"]
Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.924509 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6rw4v"]
Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.925259 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.925603 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.425568063 +0000 UTC m=+216.683107617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.943283 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" event={"ID":"5b5292e2-0434-46c6-ba9e-33622d4d5cbf","Type":"ContainerStarted","Data":"8cd984c740c34badb780e64f849955677d311b7300000ba185610d4ffa3f9a66"}
Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.951344 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" event={"ID":"d5b604c3-aa52-42f3-8922-8edee056f016","Type":"ContainerStarted","Data":"fc89f23ae95a0a4526077950384af7148949bb2a8be42beaf602ae7350df2d54"}
Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.958882 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" event={"ID":"ac4a870d-8cda-423b-a15b-391830c944f4","Type":"ContainerStarted","Data":"5143db84c6592055339523f957d8a21c8537d548fe1cf469c26896d35c317321"}
Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.959990 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8lcg4" event={"ID":"6846d54c-4d22-46c7-b017-947a3986d773","Type":"ContainerStarted","Data":"9bbe669b2438e6fb16b2bad0cd53209261ddc8318eb8f31e08a3909a97e29905"}
Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.964534 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" event={"ID":"010c6a41-8e2d-4391-ac1b-82814dad98a4","Type":"ContainerStarted","Data":"4646d8a82d02b503bb83315a976355ab19911d4ff25a7bb4a4d8efbfd2c3e181"}
Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.974924 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2zhrk"]
Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.975488 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" event={"ID":"678f27fc-d210-4a4f-bd73-090378740da9","Type":"ContainerStarted","Data":"8e97fd8fcdef99a06975af07b11d983d49d1856c8a620f0853e184ef575d88e1"}
Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.976276 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" podStartSLOduration=160.976253438 podStartE2EDuration="2m40.976253438s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:06.97459083 +0000 UTC m=+216.232130384" watchObservedRunningTime="2026-02-27 00:09:06.976253438 +0000 UTC m=+216.233793002"
Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.994102 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2zw27" event={"ID":"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7","Type":"ContainerStarted","Data":"6f94b6cdd9853273e3a3490b567a3e1c2ed77f91dacc937621e1f8ea1b8ce8a9"}
Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.995267 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-2zw27"
Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.997706 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc"]
Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.999180 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" event={"ID":"e1c9b213-8c36-4ecf-831f-69a912f6364f","Type":"ContainerStarted","Data":"0b9131f6200bec26d0bbf0b742bbd481abca74e62ee0d568341df4edfb18e5df"}
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.009119 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-sl77b" event={"ID":"38d33d08-97ce-49cb-b200-8ee30fc09e77","Type":"ContainerStarted","Data":"7a08987355db71a575fda8eac26f159da3be9b527fb24f30a4e536f887e83e3d"}
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.012333 4781 patch_prober.go:28] interesting pod/console-operator-58897d9998-2zw27 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body=
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.012398 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2zw27" podUID="55d8ebfe-a683-40f4-a3ef-bbeadb78ced7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.020742 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" event={"ID":"3f3571fd-ce1b-4105-9100-020fd1cd5076","Type":"ContainerStarted","Data":"a3b992a0fb4c2108c27aba88e7ef166a11d537392232af50898e2980eec06e23"}
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.024229 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535848-ccctv"]
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.026911 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.031739 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.531717464 +0000 UTC m=+216.789257198 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.034097 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" podStartSLOduration=161.034072118 podStartE2EDuration="2m41.034072118s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.020951376 +0000 UTC m=+216.278490930" watchObservedRunningTime="2026-02-27 00:09:07.034072118 +0000 UTC m=+216.291611672"
Feb 27 00:09:07 crc kubenswrapper[4781]: W0227 00:09:07.052829 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf626ec7_00c1_4ea9_9e8a_1e4a2b66431b.slice/crio-8fca182de0e27c23c7179b6016938f296917be6dc1ab0f25ef6496e1df363d8a WatchSource:0}: Error finding container 8fca182de0e27c23c7179b6016938f296917be6dc1ab0f25ef6496e1df363d8a: Status 404 returned error can't find the container with id 8fca182de0e27c23c7179b6016938f296917be6dc1ab0f25ef6496e1df363d8a
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.053823 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" event={"ID":"5cbee45f-1bdf-44e9-9782-83340ea69870","Type":"ContainerStarted","Data":"3fba6bbce7c2a12bf2c34025966ebefb7bc8e717b07803c5d4e37ed053b6d787"}
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.068944 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" podStartSLOduration=161.068921609 podStartE2EDuration="2m41.068921609s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.05459881 +0000 UTC m=+216.312138364" watchObservedRunningTime="2026-02-27 00:09:07.068921609 +0000 UTC m=+216.326461163"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.113234 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" event={"ID":"6497cf4e-c461-4db9-88e4-5de2a5f28404","Type":"ContainerStarted","Data":"664a75d7c7ee02d01dd0a52c2dad68c164ab167d31654e4a25c1a4eb7af6eeb3"}
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.115228 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.129267 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.130472 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.630450554 +0000 UTC m=+216.887990108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.154944 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" event={"ID":"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62","Type":"ContainerStarted","Data":"83dc0b93862e776c3facc82b45fa63497d5f2c98362fa8f812d593bf8c5410d3"}
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.158516 4781 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-sw7s5 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" start-of-body=
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.158591 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" podUID="6497cf4e-c461-4db9-88e4-5de2a5f28404" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.182755 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" event={"ID":"26e75b38-be64-4f34-933f-731abfe217b6","Type":"ContainerStarted","Data":"b2453b51fad73b4748df174667c74d32b7b0d1789503b3ac2aa68e83deb0364a"}
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.214907 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.215412 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" event={"ID":"ae09caff-6233-41f8-bb7d-a2314363e2fa","Type":"ContainerStarted","Data":"52375f28f61199965177d7e1dae2db2484713d90c6d03abe75862f9141503421"}
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.216118 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.219326 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29535840-t9tlz"]
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.221959 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x"]
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.228170 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" event={"ID":"14db9d97-7da5-43c2-8d48-fb435f1a19d0","Type":"ContainerStarted","Data":"11a994d551e39f6d057d4851d8e1bebcd7f8950398cf71e37c1a5c0f55973eeb"}
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.231412 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.231657 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k8qh8"]
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.232017 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-vtsxv" podStartSLOduration=161.232006679 podStartE2EDuration="2m41.232006679s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.175761266 +0000 UTC m=+216.433300830" watchObservedRunningTime="2026-02-27 00:09:07.232006679 +0000 UTC m=+216.489546233"
Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.233291 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.733275518 +0000 UTC m=+216.990815152 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.234017 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qjwrj"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.249957 4781 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mmx87 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body=
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.250322 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" podUID="ae09caff-6233-41f8-bb7d-a2314363e2fa" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.252179 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" podStartSLOduration=161.252169942 podStartE2EDuration="2m41.252169942s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.249227395 +0000 UTC m=+216.506766949" watchObservedRunningTime="2026-02-27 00:09:07.252169942 +0000 UTC m=+216.509709496"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.253397 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-574r8"]
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.253431 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" event={"ID":"98d3eede-8852-4bf5-a905-25974e47445f","Type":"ContainerStarted","Data":"befb4f7692e3d74436fee0560fd38addd6635f19d53ef755b808ff07a5fca84a"}
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.258192 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" event={"ID":"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237","Type":"ContainerStarted","Data":"fdcf36e8f25a660de079c42743efc1da2e0c2077dacdf20ab9116c4c2a9cdaab"}
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.259055 4781 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-swgz7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.259095 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" podUID="a24423db-53f2-4555-81e4-228b3911e144" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.259321 4781 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wgpv7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body=
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.259337 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.260183 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.260217 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Feb 27 00:09:07 crc kubenswrapper[4781]: W0227 00:09:07.273728 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd798400_ea88_4aad_ae19_815b6b8d57da.slice/crio-ddc28522b63ffd34c2e34573e44510bb7bf3c28d3b9016bc7f9f73b81b7cca21 WatchSource:0}: Error finding container ddc28522b63ffd34c2e34573e44510bb7bf3c28d3b9016bc7f9f73b81b7cca21: Status 404 returned error can't find the container with id ddc28522b63ffd34c2e34573e44510bb7bf3c28d3b9016bc7f9f73b81b7cca21
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.332611 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.332960 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.832930759 +0000 UTC m=+217.090470313 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.339462 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.340269 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.840234647 +0000 UTC m=+217.097774201 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.340288 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" podStartSLOduration=161.340261928 podStartE2EDuration="2m41.340261928s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.303557554 +0000 UTC m=+216.561097108" watchObservedRunningTime="2026-02-27 00:09:07.340261928 +0000 UTC m=+216.597801482"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.397006 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" podStartSLOduration=162.396987012 podStartE2EDuration="2m42.396987012s" podCreationTimestamp="2026-02-27 00:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.331923536 +0000 UTC m=+216.589463100" watchObservedRunningTime="2026-02-27 00:09:07.396987012 +0000 UTC m=+216.654526566"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.398562 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" podStartSLOduration=161.398555488 podStartE2EDuration="2m41.398555488s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.39557328 +0000 UTC m=+216.653112834" watchObservedRunningTime="2026-02-27 00:09:07.398555488 +0000 UTC m=+216.656095042"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.449088 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.449393 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.949378607 +0000 UTC m=+217.206918161 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.484149 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" podStartSLOduration=161.482767755 podStartE2EDuration="2m41.482767755s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.47907289 +0000 UTC m=+216.736612444" watchObservedRunningTime="2026-02-27 00:09:07.482767755 +0000 UTC m=+216.740307319"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.538583 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-2zw27" podStartSLOduration=161.538567527 podStartE2EDuration="2m41.538567527s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.538282831 +0000 UTC m=+216.795822385" watchObservedRunningTime="2026-02-27 00:09:07.538567527 +0000 UTC m=+216.796107081"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.540319 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qjwrj" podStartSLOduration=161.540311928 podStartE2EDuration="2m41.540311928s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.502814776 +0000 UTC m=+216.760354350" watchObservedRunningTime="2026-02-27 00:09:07.540311928 +0000 UTC m=+216.797851482"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.552276 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.552578 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.052567029 +0000 UTC m=+217.310106583 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.622030 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" podStartSLOduration=161.622014196 podStartE2EDuration="2m41.622014196s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.583825358 +0000 UTC m=+216.841364912" watchObservedRunningTime="2026-02-27 00:09:07.622014196 +0000 UTC m=+216.879553750"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.659290 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.660507 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.160492631 +0000 UTC m=+217.418032185 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.661466 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" podStartSLOduration=161.661445293 podStartE2EDuration="2m41.661445293s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.660282326 +0000 UTC m=+216.917821890" watchObservedRunningTime="2026-02-27 00:09:07.661445293 +0000 UTC m=+216.918984847"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.665578 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" podStartSLOduration=161.665562427 podStartE2EDuration="2m41.665562427s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.621214818 +0000 UTC m=+216.878754372" watchObservedRunningTime="2026-02-27 00:09:07.665562427 +0000 UTC m=+216.923101981"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.666577 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.667423 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.16739254 +0000 UTC m=+217.424932094 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.697393 4781 ???:1] "http: TLS handshake error from 192.168.126.11:36582: no serving certificate available for the kubelet"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.702141 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" podStartSLOduration=161.702123958 podStartE2EDuration="2m41.702123958s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.701024053 +0000 UTC m=+216.958563627" watchObservedRunningTime="2026-02-27 00:09:07.702123958 +0000 UTC m=+216.959663512"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.736267 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" podStartSLOduration=161.736251533 podStartE2EDuration="2m41.736251533s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.732438245 +0000 UTC m=+216.989977799" watchObservedRunningTime="2026-02-27 00:09:07.736251533 +0000 UTC m=+216.993791087"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.768099 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.768467 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.268450543 +0000 UTC m=+217.525990087 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.804499 4781 ???:1] "http: TLS handshake error from 192.168.126.11:36590: no serving certificate available for the kubelet"
Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.870304 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.870730 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.370710644 +0000 UTC m=+217.628250198 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.899843 4781 ???:1] "http: TLS handshake error from 192.168.126.11:36600: no serving certificate available for the kubelet" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.971514 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.971900 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.471789959 +0000 UTC m=+217.729329503 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.972063 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.974950 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.474927181 +0000 UTC m=+217.732466735 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.987734 4781 ???:1] "http: TLS handshake error from 192.168.126.11:36610: no serving certificate available for the kubelet" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.079758 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.080097 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.580081198 +0000 UTC m=+217.837620752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.084004 4781 ???:1] "http: TLS handshake error from 192.168.126.11:36624: no serving certificate available for the kubelet" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.182410 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.184114 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.6840993 +0000 UTC m=+217.941638854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.260331 4781 ???:1] "http: TLS handshake error from 192.168.126.11:36634: no serving certificate available for the kubelet" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.272948 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8lcg4" event={"ID":"6846d54c-4d22-46c7-b017-947a3986d773","Type":"ContainerStarted","Data":"3ec609bc241485334906f05e040e09ce986640ade2b04f4e43a37b8fd22b2fc8"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.286809 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" event={"ID":"bf32f77b-92ad-479d-8ee3-423f16089eb6","Type":"ContainerStarted","Data":"5224c25a7e621a9e9012e4e11e4ee9613e1854fc7844ad721ae3070b7213daec"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.286853 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" event={"ID":"bf32f77b-92ad-479d-8ee3-423f16089eb6","Type":"ContainerStarted","Data":"8ceccf018021f28a76b1767c0a451cf32f6d8967a8d2e2f452ef8818c044517d"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.287314 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.287605 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.78759109 +0000 UTC m=+218.045130644 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.288796 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" event={"ID":"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59","Type":"ContainerStarted","Data":"8fcf8e9b27758ab1e005f19315672289f24b12042f2722b628ae3a3a01a876a4"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.307874 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" event={"ID":"d5b604c3-aa52-42f3-8922-8edee056f016","Type":"ContainerStarted","Data":"32c4427ac5091334a11909736774f7bd80659ddd13e8cd48c20b8b1c1e4cbe4d"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.309304 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-8lcg4" podStartSLOduration=162.309292559 podStartE2EDuration="2m42.309292559s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:08.307537008 +0000 UTC m=+217.565076562" watchObservedRunningTime="2026-02-27 00:09:08.309292559 +0000 UTC m=+217.566832113" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.324609 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" event={"ID":"d9ce11ed-3022-47e0-8150-8af94af65076","Type":"ContainerStarted","Data":"59fc59679a172212b6e6db1643066e055ceeb6fccb9ef1c0f52b61b5a2e331b4"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.333109 4781 generic.go:334] "Generic (PLEG): container finished" podID="e1c9b213-8c36-4ecf-831f-69a912f6364f" containerID="ad4696dc33a53d001e1c01038e280aa12b2d5bb8553786ed2d43c27cd13a084d" exitCode=0 Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.333371 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" event={"ID":"e1c9b213-8c36-4ecf-831f-69a912f6364f","Type":"ContainerDied","Data":"ad4696dc33a53d001e1c01038e280aa12b2d5bb8553786ed2d43c27cd13a084d"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.347304 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" podStartSLOduration=163.347286592 podStartE2EDuration="2m43.347286592s" podCreationTimestamp="2026-02-27 00:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:08.343793672 +0000 UTC m=+217.601333226" watchObservedRunningTime="2026-02-27 00:09:08.347286592 +0000 UTC m=+217.604826146" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.360883 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-574r8" 
event={"ID":"bd798400-ea88-4aad-ae19-815b6b8d57da","Type":"ContainerStarted","Data":"21771b43c514592a65ad5552e1e929b9c5b7df3a909f999160490d3099d2a8da"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.360937 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-574r8" event={"ID":"bd798400-ea88-4aad-ae19-815b6b8d57da","Type":"ContainerStarted","Data":"ddc28522b63ffd34c2e34573e44510bb7bf3c28d3b9016bc7f9f73b81b7cca21"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.363829 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.366061 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8lcg4 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.366124 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lcg4" podUID="6846d54c-4d22-46c7-b017-947a3986d773" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.379393 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" event={"ID":"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62","Type":"ContainerStarted","Data":"9929fa21f47069f64f78d6c0d0314ca1c08d1dad35bf6dc9830c9601924c0444"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.392207 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.392653 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.892619995 +0000 UTC m=+218.150159549 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.393754 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" event={"ID":"16f4a859-d834-408d-9a9c-4d293b47d95a","Type":"ContainerStarted","Data":"5ab3a56e8c7b0d7634469a7ff6fbbb6939fb6353224b40e1746b82a0d6807700"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.393815 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" event={"ID":"16f4a859-d834-408d-9a9c-4d293b47d95a","Type":"ContainerStarted","Data":"823a400d2793709daba559c6a9b76da7307e5273d5a5e1dc43fd258e3e48c0b9"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.432853 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" 
event={"ID":"ac4a870d-8cda-423b-a15b-391830c944f4","Type":"ContainerStarted","Data":"06fc308c716d99e0b9739a3e92c8410f2d714a0cf2a58882f1bef1fe37557956"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.456381 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" podStartSLOduration=163.45636534 podStartE2EDuration="2m43.45636534s" podCreationTimestamp="2026-02-27 00:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:08.408124911 +0000 UTC m=+217.665664475" watchObservedRunningTime="2026-02-27 00:09:08.45636534 +0000 UTC m=+217.713904894" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.469229 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" event={"ID":"1234fab4-2533-4255-bdc2-dd1c3d3d61b5","Type":"ContainerStarted","Data":"02fcc16946420d5ff5300df02cb41acee13fa00816c45b9fd03ae7735368202e"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.469275 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" event={"ID":"1234fab4-2533-4255-bdc2-dd1c3d3d61b5","Type":"ContainerStarted","Data":"19ec0c61709e31510f36979c4154d8a604c364f2f1e4a73a25e64466646bfcde"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.469493 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.481881 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29535840-t9tlz" event={"ID":"91e2c481-01ee-461f-bc5b-d09b7ea221c5","Type":"ContainerStarted","Data":"34034ef1e924a05fbc92daf60e2f0c105f332a30b0fe9cea72b0da3d3065e13e"} Feb 27 00:09:08 crc 
kubenswrapper[4781]: I0227 00:09:08.481926 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29535840-t9tlz" event={"ID":"91e2c481-01ee-461f-bc5b-d09b7ea221c5","Type":"ContainerStarted","Data":"02350f41c01977124604e142f885201d5743582263439e32be7f03871d0f9773"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.497811 4781 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-7vd5x container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.498320 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" podUID="1234fab4-2533-4255-bdc2-dd1c3d3d61b5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.498173 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.498234 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.998218492 +0000 UTC m=+218.255758046 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.498905 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.501892 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.001877237 +0000 UTC m=+218.259416781 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.505022 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535848-ccctv" event={"ID":"df035290-8e3c-422b-90ac-573b592defcf","Type":"ContainerStarted","Data":"73bd0b78edcc81c67b914cc89cfaf8646b9814d5783ad5e9856330864dac671a"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.521003 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-sl77b" event={"ID":"38d33d08-97ce-49cb-b200-8ee30fc09e77","Type":"ContainerStarted","Data":"42f8def94e6d3c12464c6b3feeae43ac20bea1ebb7ffb0eeeb0d126d2bc13cb0"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.523453 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" event={"ID":"b9dadb6a-e49e-4473-8338-3af567aacb4a","Type":"ContainerStarted","Data":"ac40adffdd127da39ae2507907fb38373b6d69fa918c3ddcfaee6d0e52368b01"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.581513 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" event={"ID":"5cbee45f-1bdf-44e9-9782-83340ea69870","Type":"ContainerStarted","Data":"acde24607c7b0fb059c97edcf650a34ce01281254111d33a16097e2c6937e171"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.581561 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" event={"ID":"5cbee45f-1bdf-44e9-9782-83340ea69870","Type":"ContainerStarted","Data":"b975438d9bf0effc0f383ed3409cf91c5425e96642be5f7f94a4b6b6f475b948"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.582130 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.606082 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" event={"ID":"678f27fc-d210-4a4f-bd73-090378740da9","Type":"ContainerStarted","Data":"898ccef1da25e7c00fcd11040419fe4b505ada16cb26d62d9a4806872cb68348"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.608039 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.609256 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.109242885 +0000 UTC m=+218.366782439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.616985 4781 ???:1] "http: TLS handshake error from 192.168.126.11:56562: no serving certificate available for the kubelet" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.635297 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" event={"ID":"3f3571fd-ce1b-4105-9100-020fd1cd5076","Type":"ContainerStarted","Data":"a6cfbebe704fb1381bfafda6527c42b651059ad2455446c3086177fe8be79344"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.637079 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" event={"ID":"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b","Type":"ContainerStarted","Data":"126cdc4a0b2277218672d3ebb95a09f31d56ce658295a2f26c3118300b77f809"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.637101 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" event={"ID":"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b","Type":"ContainerStarted","Data":"8fca182de0e27c23c7179b6016938f296917be6dc1ab0f25ef6496e1df363d8a"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.637541 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.675218 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" event={"ID":"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464","Type":"ContainerStarted","Data":"63cb1f3b771058d589224d8fb22198abea8226ee472d22ccf972deff4e404cc9"}
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.675586 4781 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2zhrk container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.20:6443/healthz\": dial tcp 10.217.0.20:6443: connect: connection refused" start-of-body=
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.675662 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" podUID="cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.20:6443/healthz\": dial tcp 10.217.0.20:6443: connect: connection refused"
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.676765 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-574r8" podStartSLOduration=7.6767532769999995 podStartE2EDuration="7.676753277s" podCreationTimestamp="2026-02-27 00:09:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:08.676098402 +0000 UTC m=+217.933637956" watchObservedRunningTime="2026-02-27 00:09:08.676753277 +0000 UTC m=+217.934292821"
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.678764 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" podStartSLOduration=162.678756854 podStartE2EDuration="2m42.678756854s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:08.582336717 +0000 UTC m=+217.839876271" watchObservedRunningTime="2026-02-27 00:09:08.678756854 +0000 UTC m=+217.936296398"
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.703154 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" event={"ID":"396c6e41-89e8-4ecf-ac96-f73aad1f4bbb","Type":"ContainerStarted","Data":"ccd51fda8823b76ef5845848f8d0690dfeeabdb21391d0115db34185a41b1b97"}
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.703211 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" event={"ID":"396c6e41-89e8-4ecf-ac96-f73aad1f4bbb","Type":"ContainerStarted","Data":"3a16f1a6ffa2d016c4c94897233d60791a1997b713a1593909bcc29419e196c4"}
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.706102 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qjwrj" event={"ID":"a75bfacf-8cf7-4560-8b4a-6e876daa4c8c","Type":"ContainerStarted","Data":"a7c8f7063bafc919361346a9dc1315b92859e40920e9b450653ad64294dfaf96"}
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.710443 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.711175 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.711220 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.711543 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.211527847 +0000 UTC m=+218.469067491 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.746747 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" event={"ID":"98d3eede-8852-4bf5-a905-25974e47445f","Type":"ContainerStarted","Data":"c3c25fbc57e5d781df69b1687c08ccf96eb500c411327096c0537e00a427f258"}
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.753779 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" event={"ID":"9d51c244-aac1-41de-adc4-2393a45392f1","Type":"ContainerStarted","Data":"3695a4c6299a6ec731f058159a25a778df05ce7dbe7bc7d51aa249ce1349c630"}
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.767819 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k8qh8" event={"ID":"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5","Type":"ContainerStarted","Data":"c68ec309647256b6cd7991fa50c2092e7afd1452b00f47baaff4d838af7fd462"}
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.768050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k8qh8" event={"ID":"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5","Type":"ContainerStarted","Data":"c0e1c18fc197bd478ed251faf34738158576ad1a668568d4158f01011e81ffdb"}
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.803280 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" event={"ID":"14db9d97-7da5-43c2-8d48-fb435f1a19d0","Type":"ContainerStarted","Data":"07cbadf40a6f8fc1edfbfb6dfbab48131215d5e9beb738042ced3279ca75fad1"}
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.803722 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" event={"ID":"14db9d97-7da5-43c2-8d48-fb435f1a19d0","Type":"ContainerStarted","Data":"e2e4c66c8d45404b36d849e8a515fd4de6c19674d5072fa80ab2cee64f2a7f7f"}
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.803727 4781 patch_prober.go:28] interesting pod/console-operator-58897d9998-2zw27 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body=
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.803784 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2zw27" podUID="55d8ebfe-a683-40f4-a3ef-bbeadb78ced7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused"
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.813125 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.814445 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.314424443 +0000 UTC m=+218.571963997 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.855939 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" podStartSLOduration=162.855906617 podStartE2EDuration="2m42.855906617s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:08.852822366 +0000 UTC m=+218.110361930" watchObservedRunningTime="2026-02-27 00:09:08.855906617 +0000 UTC m=+218.113446161"
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.856927 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" podStartSLOduration=162.8569225 podStartE2EDuration="2m42.8569225s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:08.758704652 +0000 UTC m=+218.016244206" watchObservedRunningTime="2026-02-27 00:09:08.8569225 +0000 UTC m=+218.114462054"
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.884339 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87"
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.915189 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.916220 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.416208073 +0000 UTC m=+218.673747627 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.928736 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29535840-t9tlz" podStartSLOduration=162.928718161 podStartE2EDuration="2m42.928718161s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:08.927842611 +0000 UTC m=+218.185382165" watchObservedRunningTime="2026-02-27 00:09:08.928718161 +0000 UTC m=+218.186257715"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.018027 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.018512 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.518495885 +0000 UTC m=+218.776035439 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.060559 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" podStartSLOduration=163.060540082 podStartE2EDuration="2m43.060540082s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.009050468 +0000 UTC m=+218.266590022" watchObservedRunningTime="2026-02-27 00:09:09.060540082 +0000 UTC m=+218.318079636"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.076247 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.076516 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.090737 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.091306 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.097992 4781 ???:1] "http: TLS handshake error from 192.168.126.11:56568: no serving certificate available for the kubelet"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.126186 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" podStartSLOduration=163.126171801 podStartE2EDuration="2m43.126171801s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.125324801 +0000 UTC m=+218.382864355" watchObservedRunningTime="2026-02-27 00:09:09.126171801 +0000 UTC m=+218.383711355"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.127342 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.127826 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.627810338 +0000 UTC m=+218.885349892 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.128443 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" podStartSLOduration=163.128431393 podStartE2EDuration="2m43.128431393s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.062495167 +0000 UTC m=+218.320034721" watchObservedRunningTime="2026-02-27 00:09:09.128431393 +0000 UTC m=+218.385970977"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.196704 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" podStartSLOduration=163.196687652 podStartE2EDuration="2m43.196687652s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.19443161 +0000 UTC m=+218.451971164" watchObservedRunningTime="2026-02-27 00:09:09.196687652 +0000 UTC m=+218.454227206"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.207842 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.228456 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.228885 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.728870272 +0000 UTC m=+218.986409826 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.305367 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" podStartSLOduration=163.30533323 podStartE2EDuration="2m43.30533323s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.245133356 +0000 UTC m=+218.502672900" watchObservedRunningTime="2026-02-27 00:09:09.30533323 +0000 UTC m=+218.562872794"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.330412 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.330771 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.830759315 +0000 UTC m=+219.088298869 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.367375 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" podStartSLOduration=163.367357356 podStartE2EDuration="2m43.367357356s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.318868101 +0000 UTC m=+218.576407645" watchObservedRunningTime="2026-02-27 00:09:09.367357356 +0000 UTC m=+218.624896910"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.418956 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8lcg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 27 00:09:09 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld
Feb 27 00:09:09 crc kubenswrapper[4781]: [+]process-running ok
Feb 27 00:09:09 crc kubenswrapper[4781]: healthz check failed
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.419043 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lcg4" podUID="6846d54c-4d22-46c7-b017-947a3986d773" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.431819 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.432231 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.932214598 +0000 UTC m=+219.189754152 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.439643 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-sl77b" podStartSLOduration=7.439612868 podStartE2EDuration="7.439612868s" podCreationTimestamp="2026-02-27 00:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.375464163 +0000 UTC m=+218.633003717" watchObservedRunningTime="2026-02-27 00:09:09.439612868 +0000 UTC m=+218.697152422"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.440069 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" podStartSLOduration=163.440064898 podStartE2EDuration="2m43.440064898s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.437356516 +0000 UTC m=+218.694896070" watchObservedRunningTime="2026-02-27 00:09:09.440064898 +0000 UTC m=+218.697604462"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.522398 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" podStartSLOduration=163.522380511 podStartE2EDuration="2m43.522380511s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.521991862 +0000 UTC m=+218.779531416" watchObservedRunningTime="2026-02-27 00:09:09.522380511 +0000 UTC m=+218.779920065"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.533645 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.534521 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.03450869 +0000 UTC m=+219.292048244 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.554040 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" podStartSLOduration=163.554009148 podStartE2EDuration="2m43.554009148s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.551859349 +0000 UTC m=+218.809398903" watchObservedRunningTime="2026-02-27 00:09:09.554009148 +0000 UTC m=+218.811548692"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.623424 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" podStartSLOduration=163.623386713 podStartE2EDuration="2m43.623386713s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.615086072 +0000 UTC m=+218.872625636" watchObservedRunningTime="2026-02-27 00:09:09.623386713 +0000 UTC m=+218.880926267"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.634668 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.634944 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.134899478 +0000 UTC m=+219.392439032 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.635352 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.635717 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.135700376 +0000 UTC m=+219.393239930 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.711480 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" podStartSLOduration=163.711464268 podStartE2EDuration="2m43.711464268s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.710642569 +0000 UTC m=+218.968182123" watchObservedRunningTime="2026-02-27 00:09:09.711464268 +0000 UTC m=+218.969003822"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.737000 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.737320 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.237305402 +0000 UTC m=+219.494844956 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.834374 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" event={"ID":"e1c9b213-8c36-4ecf-831f-69a912f6364f","Type":"ContainerStarted","Data":"af4073fdeecca7a7aa604ec924d63a357759f8c158c1a298f15e9d389ac98486"}
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.834478 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.839449 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.839763 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.339750087 +0000 UTC m=+219.597289631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.846808 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" event={"ID":"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59","Type":"ContainerStarted","Data":"d792e54137f4bdb958e476f08a383626f9798d849f05c35836509bfd2561a429"}
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.846852 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" event={"ID":"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59","Type":"ContainerStarted","Data":"8d62d374d868463d06a572a2831fb88e13d49e7e6c27c1eea169b4b6fd868e01"}
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.848844 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" event={"ID":"d5b604c3-aa52-42f3-8922-8edee056f016","Type":"ContainerStarted","Data":"226ab1af46a96d5e2d3021413e6b3e5e048d3912a0c3b00819349e01a3736a07"}
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.858597 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k8qh8" event={"ID":"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5","Type":"ContainerStarted","Data":"eca78ef0786186f4f67e7ce7e0815c48a2dae136d8020f874ef92b453160d9c0"}
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.859269 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-k8qh8"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.869981 4781 ???:1] "http: TLS handshake error from 192.168.126.11:56574: no serving certificate available for the kubelet"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.873915 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" event={"ID":"26e75b38-be64-4f34-933f-731abfe217b6","Type":"ContainerStarted","Data":"6b0aab158a544e1afdf453eadcba2b38defe3314e343d756ffb848145161f206"}
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.884066 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.884116 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.921863 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x"
Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.940521 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.942120 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.44210465 +0000 UTC m=+219.699644204 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.042804 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.047796 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.54778352 +0000 UTC m=+219.805323074 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.135815 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" podStartSLOduration=164.135784324 podStartE2EDuration="2m44.135784324s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:10.032498689 +0000 UTC m=+219.290038243" watchObservedRunningTime="2026-02-27 00:09:10.135784324 +0000 UTC m=+219.393323888" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.144191 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.144408 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.644388551 +0000 UTC m=+219.901928105 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.144533 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.144864 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.644856562 +0000 UTC m=+219.902396116 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.238733 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" podStartSLOduration=164.23871337 podStartE2EDuration="2m44.23871337s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:10.190120823 +0000 UTC m=+219.447660387" watchObservedRunningTime="2026-02-27 00:09:10.23871337 +0000 UTC m=+219.496252924" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.249203 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.249525 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.749509988 +0000 UTC m=+220.007049542 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.318731 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-k8qh8" podStartSLOduration=8.31870956 podStartE2EDuration="8.31870956s" podCreationTimestamp="2026-02-27 00:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:10.24130413 +0000 UTC m=+219.498843684" watchObservedRunningTime="2026-02-27 00:09:10.31870956 +0000 UTC m=+219.576249124" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.350324 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.350696 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.850684485 +0000 UTC m=+220.108224039 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.366615 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8lcg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 00:09:10 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Feb 27 00:09:10 crc kubenswrapper[4781]: [+]process-running ok Feb 27 00:09:10 crc kubenswrapper[4781]: healthz check failed Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.366680 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lcg4" podUID="6846d54c-4d22-46c7-b017-947a3986d773" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.451949 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.452375 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-27 00:09:10.952358712 +0000 UTC m=+220.209898266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.553345 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.553695 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.053679702 +0000 UTC m=+220.311219256 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.566605 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" podStartSLOduration=164.566585229 podStartE2EDuration="2m44.566585229s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:10.322387464 +0000 UTC m=+219.579927028" watchObservedRunningTime="2026-02-27 00:09:10.566585229 +0000 UTC m=+219.824124783" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.569324 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ktjdc"] Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.569516 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" podUID="c7332c18-9748-49d2-b512-a46c2d1fcb79" containerName="controller-manager" containerID="cri-o://7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710" gracePeriod=30 Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.601945 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.655052 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.655586 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.155563965 +0000 UTC m=+220.413103519 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.661283 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.663081 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.665866 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"] Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.666072 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" podUID="a24423db-53f2-4555-81e4-228b3911e144" 
containerName="route-controller-manager" containerID="cri-o://1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3" gracePeriod=30 Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.681238 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.758387 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.758733 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.258720546 +0000 UTC m=+220.516260090 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.864590 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.865163 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.365140303 +0000 UTC m=+220.622679847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.874048 4781 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2zhrk container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.20:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.874117 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" podUID="cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.20:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.912106 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.967687 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.971323 
4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.471305294 +0000 UTC m=+220.728844848 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.071847 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.071964 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.571948448 +0000 UTC m=+220.829488002 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.072361 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.072659 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.572651624 +0000 UTC m=+220.830191178 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.089820 4781 patch_prober.go:28] interesting pod/apiserver-76f77b778f-cr2bb container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 27 00:09:11 crc kubenswrapper[4781]: [+]log ok Feb 27 00:09:11 crc kubenswrapper[4781]: [+]etcd ok Feb 27 00:09:11 crc kubenswrapper[4781]: [-]poststarthook/start-apiserver-admission-initializer failed: reason withheld Feb 27 00:09:11 crc kubenswrapper[4781]: [+]poststarthook/generic-apiserver-start-informers ok Feb 27 00:09:11 crc kubenswrapper[4781]: [-]poststarthook/max-in-flight-filter failed: reason withheld Feb 27 00:09:11 crc kubenswrapper[4781]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 27 00:09:11 crc kubenswrapper[4781]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 27 00:09:11 crc kubenswrapper[4781]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 27 00:09:11 crc kubenswrapper[4781]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 27 00:09:11 crc kubenswrapper[4781]: [+]poststarthook/project.openshift.io-projectcache ok Feb 27 00:09:11 crc kubenswrapper[4781]: [-]poststarthook/project.openshift.io-projectauthorizationcache failed: reason withheld Feb 27 00:09:11 crc kubenswrapper[4781]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Feb 27 00:09:11 crc 
kubenswrapper[4781]: [-]poststarthook/openshift.io-restmapperupdater failed: reason withheld Feb 27 00:09:11 crc kubenswrapper[4781]: [-]poststarthook/quota.openshift.io-clusterquotamapping failed: reason withheld Feb 27 00:09:11 crc kubenswrapper[4781]: livez check failed Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.089896 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" podUID="d9ce11ed-3022-47e0-8150-8af94af65076" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.173601 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.173999 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.673984464 +0000 UTC m=+220.931524018 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.191188 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.210514 4781 ???:1] "http: TLS handshake error from 192.168.126.11:56584: no serving certificate available for the kubelet" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.275162 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.275716 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.775698513 +0000 UTC m=+221.033238067 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.367842 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8lcg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 27 00:09:11 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld
Feb 27 00:09:11 crc kubenswrapper[4781]: [+]process-running ok
Feb 27 00:09:11 crc kubenswrapper[4781]: healthz check failed
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.367898 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lcg4" podUID="6846d54c-4d22-46c7-b017-947a3986d773" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.376363 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.376591 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.876546742 +0000 UTC m=+221.134086296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.376789 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.377084 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.877070524 +0000 UTC m=+221.134610078 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.475905 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-42hbx"]
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.478137 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.478435 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.978417174 +0000 UTC m=+221.235956728 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.479987 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42hbx"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.480029 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-42hbx"]
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.493310 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.559679 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.582490 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-utilities\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.582559 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgndh\" (UniqueName: \"kubernetes.io/projected/19ed5401-2778-4266-8bf1-1c7244dac100-kube-api-access-xgndh\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.582589 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-catalog-content\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.582612 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.582896 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.082883716 +0000 UTC m=+221.340423260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.663180 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.663202 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kztqg"]
Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.663659 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7332c18-9748-49d2-b512-a46c2d1fcb79" containerName="controller-manager"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.663680 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7332c18-9748-49d2-b512-a46c2d1fcb79" containerName="controller-manager"
Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.663699 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24423db-53f2-4555-81e4-228b3911e144" containerName="route-controller-manager"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.663706 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24423db-53f2-4555-81e4-228b3911e144" containerName="route-controller-manager"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.663811 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7332c18-9748-49d2-b512-a46c2d1fcb79" containerName="controller-manager"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.663833 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a24423db-53f2-4555-81e4-228b3911e144" containerName="route-controller-manager"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.664525 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kztqg"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.670113 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.673122 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kztqg"]
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.683499 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-client-ca\") pod \"c7332c18-9748-49d2-b512-a46c2d1fcb79\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") "
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.683555 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7332c18-9748-49d2-b512-a46c2d1fcb79-serving-cert\") pod \"c7332c18-9748-49d2-b512-a46c2d1fcb79\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") "
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.683605 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-proxy-ca-bundles\") pod \"c7332c18-9748-49d2-b512-a46c2d1fcb79\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") "
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.683736 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.683800 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-config\") pod \"c7332c18-9748-49d2-b512-a46c2d1fcb79\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") "
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.683823 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnbwb\" (UniqueName: \"kubernetes.io/projected/c7332c18-9748-49d2-b512-a46c2d1fcb79-kube-api-access-tnbwb\") pod \"c7332c18-9748-49d2-b512-a46c2d1fcb79\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") "
Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.683957 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.183932609 +0000 UTC m=+221.441472163 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.684168 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-utilities\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.684240 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgndh\" (UniqueName: \"kubernetes.io/projected/19ed5401-2778-4266-8bf1-1c7244dac100-kube-api-access-xgndh\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.684274 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-catalog-content\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.684296 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.684585 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.184578204 +0000 UTC m=+221.442117758 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.685285 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-config" (OuterVolumeSpecName: "config") pod "c7332c18-9748-49d2-b512-a46c2d1fcb79" (UID: "c7332c18-9748-49d2-b512-a46c2d1fcb79"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.685480 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-utilities\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.685829 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c7332c18-9748-49d2-b512-a46c2d1fcb79" (UID: "c7332c18-9748-49d2-b512-a46c2d1fcb79"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.686240 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-catalog-content\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.686296 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-client-ca" (OuterVolumeSpecName: "client-ca") pod "c7332c18-9748-49d2-b512-a46c2d1fcb79" (UID: "c7332c18-9748-49d2-b512-a46c2d1fcb79"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.693897 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7332c18-9748-49d2-b512-a46c2d1fcb79-kube-api-access-tnbwb" (OuterVolumeSpecName: "kube-api-access-tnbwb") pod "c7332c18-9748-49d2-b512-a46c2d1fcb79" (UID: "c7332c18-9748-49d2-b512-a46c2d1fcb79"). InnerVolumeSpecName "kube-api-access-tnbwb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.694027 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7332c18-9748-49d2-b512-a46c2d1fcb79-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c7332c18-9748-49d2-b512-a46c2d1fcb79" (UID: "c7332c18-9748-49d2-b512-a46c2d1fcb79"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.712418 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgndh\" (UniqueName: \"kubernetes.io/projected/19ed5401-2778-4266-8bf1-1c7244dac100-kube-api-access-xgndh\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.762216 4781 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.785582 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24423db-53f2-4555-81e4-228b3911e144-serving-cert\") pod \"a24423db-53f2-4555-81e4-228b3911e144\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") "
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.785677 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrg6p\" (UniqueName: \"kubernetes.io/projected/a24423db-53f2-4555-81e4-228b3911e144-kube-api-access-xrg6p\") pod \"a24423db-53f2-4555-81e4-228b3911e144\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") "
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.785724 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-config\") pod \"a24423db-53f2-4555-81e4-228b3911e144\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") "
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.785747 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-client-ca\") pod \"a24423db-53f2-4555-81e4-228b3911e144\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") "
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.785894 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786227 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpnxw\" (UniqueName: \"kubernetes.io/projected/2b050e9e-d6c8-4e27-ad3f-9681553c1539-kube-api-access-zpnxw\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786293 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-utilities\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786348 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-catalog-content\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786409 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786422 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnbwb\" (UniqueName: \"kubernetes.io/projected/c7332c18-9748-49d2-b512-a46c2d1fcb79-kube-api-access-tnbwb\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786432 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-client-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786443 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7332c18-9748-49d2-b512-a46c2d1fcb79-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786454 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.786713 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.286668661 +0000 UTC m=+221.544208215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786801 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-config" (OuterVolumeSpecName: "config") pod "a24423db-53f2-4555-81e4-228b3911e144" (UID: "a24423db-53f2-4555-81e4-228b3911e144"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786949 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-client-ca" (OuterVolumeSpecName: "client-ca") pod "a24423db-53f2-4555-81e4-228b3911e144" (UID: "a24423db-53f2-4555-81e4-228b3911e144"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.790951 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a24423db-53f2-4555-81e4-228b3911e144-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a24423db-53f2-4555-81e4-228b3911e144" (UID: "a24423db-53f2-4555-81e4-228b3911e144"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.811752 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a24423db-53f2-4555-81e4-228b3911e144-kube-api-access-xrg6p" (OuterVolumeSpecName: "kube-api-access-xrg6p") pod "a24423db-53f2-4555-81e4-228b3911e144" (UID: "a24423db-53f2-4555-81e4-228b3911e144"). InnerVolumeSpecName "kube-api-access-xrg6p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.846007 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42hbx"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.864845 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kqrgb"]
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.865757 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kqrgb"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.888553 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kqrgb"]
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.888578 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpnxw\" (UniqueName: \"kubernetes.io/projected/2b050e9e-d6c8-4e27-ad3f-9681553c1539-kube-api-access-zpnxw\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.889295 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-utilities\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.889390 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-catalog-content\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.890162 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-utilities\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.890671 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-catalog-content\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg"
Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.891969 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.391946152 +0000 UTC m=+221.649485706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.889542 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.892781 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.892797 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-client-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.892811 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24423db-53f2-4555-81e4-228b3911e144-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.892825 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrg6p\" (UniqueName: \"kubernetes.io/projected/a24423db-53f2-4555-81e4-228b3911e144-kube-api-access-xrg6p\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.910452 4781 generic.go:334] "Generic (PLEG): container finished" podID="a24423db-53f2-4555-81e4-228b3911e144" containerID="1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3" exitCode=0
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.910515 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" event={"ID":"a24423db-53f2-4555-81e4-228b3911e144","Type":"ContainerDied","Data":"1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3"}
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.910543 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" event={"ID":"a24423db-53f2-4555-81e4-228b3911e144","Type":"ContainerDied","Data":"e2d52d381f5f2c2aa6e3b3529d449b9cd90d4bab2b2b6374496041f06b7f95d6"}
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.910561 4781 scope.go:117] "RemoveContainer" containerID="1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.910748 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.916015 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpnxw\" (UniqueName: \"kubernetes.io/projected/2b050e9e-d6c8-4e27-ad3f-9681553c1539-kube-api-access-zpnxw\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.916983 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" event={"ID":"26e75b38-be64-4f34-933f-731abfe217b6","Type":"ContainerStarted","Data":"e131d45bbf0a76e83c9db42c296a6f6c98df038ffcda0bb488d5e7953b3020f1"}
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.917033 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" event={"ID":"26e75b38-be64-4f34-933f-731abfe217b6","Type":"ContainerStarted","Data":"99370a1ca3aba85234b46ccb3551132e01f233f017156fc02713ad284ef2946a"}
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.918489 4781 generic.go:334] "Generic (PLEG): container finished" podID="c7332c18-9748-49d2-b512-a46c2d1fcb79" containerID="7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710" exitCode=0
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.918546 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.918575 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" event={"ID":"c7332c18-9748-49d2-b512-a46c2d1fcb79","Type":"ContainerDied","Data":"7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710"}
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.918591 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" event={"ID":"c7332c18-9748-49d2-b512-a46c2d1fcb79","Type":"ContainerDied","Data":"569469b156a3d6f73fda1c00c629b8cfcf29a4662b4eccaa3dcb213bb4a0f1d1"}
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.942688 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"]
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.945211 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"]
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.947822 4781 scope.go:117] "RemoveContainer" containerID="1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3"
Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.948213 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3\": container with ID starting with 1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3 not found: ID does not exist" containerID="1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.948245 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3"} err="failed to get container status \"1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3\": rpc error: code = NotFound desc = could not find container \"1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3\": container with ID starting with 1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3 not found: ID does not exist"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.948264 4781 scope.go:117] "RemoveContainer" containerID="7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.960052 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ktjdc"]
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.963015 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.964760 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ktjdc"]
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.990899 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kztqg"
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.994268 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.994536 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-catalog-content\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb"
Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.994638 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.494566251 +0000 UTC m=+221.752105805 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.995519 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-utilities\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.995893 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tk7f\" (UniqueName: \"kubernetes.io/projected/ac30245d-7e42-440c-99a0-60e2ae15cb8b-kube-api-access-8tk7f\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.006839 4781 scope.go:117] "RemoveContainer" containerID="7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710" Feb 27 00:09:12 crc kubenswrapper[4781]: E0227 00:09:12.007518 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710\": container with ID starting with 7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710 not found: ID does not exist" containerID="7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.007557 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710"} err="failed to get container status \"7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710\": rpc error: code = NotFound desc = could not find container \"7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710\": container with ID starting with 7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710 not found: ID does not exist" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.049927 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-52xgq"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.050945 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.089748 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-52xgq"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.106847 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-catalog-content\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.107010 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.107107 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-utilities\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.107206 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tk7f\" (UniqueName: \"kubernetes.io/projected/ac30245d-7e42-440c-99a0-60e2ae15cb8b-kube-api-access-8tk7f\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.108010 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-catalog-content\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:12 crc kubenswrapper[4781]: E0227 00:09:12.108614 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.608601873 +0000 UTC m=+221.866141427 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.112592 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-utilities\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.137851 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tk7f\" (UniqueName: \"kubernetes.io/projected/ac30245d-7e42-440c-99a0-60e2ae15cb8b-kube-api-access-8tk7f\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.192093 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.209822 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.210015 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8558\" (UniqueName: \"kubernetes.io/projected/0f286d62-2145-4bbb-91eb-28ffda9b2494-kube-api-access-f8558\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.210061 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-catalog-content\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.210109 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-utilities\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: E0227 00:09:12.210208 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-02-27 00:09:12.710192669 +0000 UTC m=+221.967732223 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.310964 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-catalog-content\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.311792 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-utilities\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.311883 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8558\" (UniqueName: \"kubernetes.io/projected/0f286d62-2145-4bbb-91eb-28ffda9b2494-kube-api-access-f8558\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.311911 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-catalog-content\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.311927 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:12 crc kubenswrapper[4781]: E0227 00:09:12.312227 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.812212395 +0000 UTC m=+222.069751949 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.320515 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-utilities\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.331574 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-42hbx"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.343775 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8558\" (UniqueName: \"kubernetes.io/projected/0f286d62-2145-4bbb-91eb-28ffda9b2494-kube-api-access-f8558\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.360231 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kztqg"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.372281 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8lcg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 00:09:12 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Feb 27 00:09:12 crc 
kubenswrapper[4781]: [+]process-running ok Feb 27 00:09:12 crc kubenswrapper[4781]: healthz check failed Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.372323 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lcg4" podUID="6846d54c-4d22-46c7-b017-947a3986d773" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 00:09:12 crc kubenswrapper[4781]: W0227 00:09:12.388229 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b050e9e_d6c8_4e27_ad3f_9681553c1539.slice/crio-f682c737bcb211243a2988ca17e566ea00c7e2d14bf78fba6f612945a62f66e6 WatchSource:0}: Error finding container f682c737bcb211243a2988ca17e566ea00c7e2d14bf78fba6f612945a62f66e6: Status 404 returned error can't find the container with id f682c737bcb211243a2988ca17e566ea00c7e2d14bf78fba6f612945a62f66e6 Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.412686 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:12 crc kubenswrapper[4781]: E0227 00:09:12.413127 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.913110745 +0000 UTC m=+222.170650289 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.441199 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.463711 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-774845979b-t9755"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.464673 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.469989 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.470775 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.472413 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.472784 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.472903 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.473222 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.473504 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.473685 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.474866 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.474919 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kqrgb"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.475358 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.475491 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 00:09:12 crc kubenswrapper[4781]: 
I0227 00:09:12.475777 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.475882 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.476749 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.480712 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774845979b-t9755"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.482159 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.483770 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.514265 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:12 crc kubenswrapper[4781]: E0227 00:09:12.514670 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:13.01465467 +0000 UTC m=+222.272194224 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.530611 4781 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-27T00:09:11.76226175Z","Handler":null,"Name":""} Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.533950 4781 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.533979 4781 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625088 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625355 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-config\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " 
pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625393 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-config\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625414 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjsgw\" (UniqueName: \"kubernetes.io/projected/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-kube-api-access-xjsgw\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625442 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3667d98-cf94-4751-8191-1d924ea13617-serving-cert\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625471 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-client-ca\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625496 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-proxy-ca-bundles\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625521 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-client-ca\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625536 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qk2n\" (UniqueName: \"kubernetes.io/projected/c3667d98-cf94-4751-8191-1d924ea13617-kube-api-access-4qk2n\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625572 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-serving-cert\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.636292 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729223 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-proxy-ca-bundles\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729284 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-client-ca\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729309 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qk2n\" (UniqueName: \"kubernetes.io/projected/c3667d98-cf94-4751-8191-1d924ea13617-kube-api-access-4qk2n\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729407 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-serving-cert\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729479 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-config\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729506 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-config\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729524 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjsgw\" (UniqueName: \"kubernetes.io/projected/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-kube-api-access-xjsgw\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729543 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729561 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3667d98-cf94-4751-8191-1d924ea13617-serving-cert\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " 
pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729584 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-client-ca\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.730492 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-client-ca\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.731707 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-proxy-ca-bundles\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.732213 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-client-ca\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.734421 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-config\") pod 
\"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.736876 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-config\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.738219 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-serving-cert\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.738244 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3667d98-cf94-4751-8191-1d924ea13617-serving-cert\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.738311 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.738342 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.747664 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjsgw\" (UniqueName: \"kubernetes.io/projected/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-kube-api-access-xjsgw\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.751084 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qk2n\" (UniqueName: \"kubernetes.io/projected/c3667d98-cf94-4751-8191-1d924ea13617-kube-api-access-4qk2n\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.753214 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-52xgq"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.763139 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" 
(UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.788453 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.815853 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.854852 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.897176 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.897234 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.928392 4781 generic.go:334] "Generic (PLEG): container finished" podID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerID="0c5e0439f18997d1945f8c92f69edded31054471dc31175a4e23307895e84fc9" exitCode=0 Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.928690 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqrgb" 
event={"ID":"ac30245d-7e42-440c-99a0-60e2ae15cb8b","Type":"ContainerDied","Data":"0c5e0439f18997d1945f8c92f69edded31054471dc31175a4e23307895e84fc9"} Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.928728 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqrgb" event={"ID":"ac30245d-7e42-440c-99a0-60e2ae15cb8b","Type":"ContainerStarted","Data":"ba66da6dc8bfa69982da2943397bfec42cd942427662c0a4732f24accf5f77a6"} Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.936525 4781 generic.go:334] "Generic (PLEG): container finished" podID="19ed5401-2778-4266-8bf1-1c7244dac100" containerID="064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a" exitCode=0 Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.936768 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42hbx" event={"ID":"19ed5401-2778-4266-8bf1-1c7244dac100","Type":"ContainerDied","Data":"064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a"} Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.936793 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42hbx" event={"ID":"19ed5401-2778-4266-8bf1-1c7244dac100","Type":"ContainerStarted","Data":"78b3df3f6b7f7425a9c2cd10f5b420e9f36ecb616bd533d5cfdfee3767475ccc"} Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.957349 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" event={"ID":"26e75b38-be64-4f34-933f-731abfe217b6","Type":"ContainerStarted","Data":"557bd1bd32a3bf797dc2d98115a973ca7f23c121046b9a168d3bbca91df7a6d2"} Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.960617 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52xgq" 
event={"ID":"0f286d62-2145-4bbb-91eb-28ffda9b2494","Type":"ContainerStarted","Data":"dc9d59b8ab934cad32f1842b836646a3832e9408664f5c6c345f309f196516de"} Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.985904 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" podStartSLOduration=10.985890165 podStartE2EDuration="10.985890165s" podCreationTimestamp="2026-02-27 00:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:12.984867251 +0000 UTC m=+222.242406795" watchObservedRunningTime="2026-02-27 00:09:12.985890165 +0000 UTC m=+222.243429719" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.003892 4781 generic.go:334] "Generic (PLEG): container finished" podID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerID="d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd" exitCode=0 Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.003960 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kztqg" event={"ID":"2b050e9e-d6c8-4e27-ad3f-9681553c1539","Type":"ContainerDied","Data":"d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd"} Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.003988 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kztqg" event={"ID":"2b050e9e-d6c8-4e27-ad3f-9681553c1539","Type":"ContainerStarted","Data":"f682c737bcb211243a2988ca17e566ea00c7e2d14bf78fba6f612945a62f66e6"} Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.116991 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774845979b-t9755"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.158597 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-tw95c"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.229492 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.265945 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.316971 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.317691 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a24423db-53f2-4555-81e4-228b3911e144" path="/var/lib/kubelet/pods/a24423db-53f2-4555-81e4-228b3911e144/volumes" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.318295 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7332c18-9748-49d2-b512-a46c2d1fcb79" path="/var/lib/kubelet/pods/c7332c18-9748-49d2-b512-a46c2d1fcb79/volumes" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.367206 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8lcg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 00:09:13 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Feb 27 00:09:13 crc kubenswrapper[4781]: [+]process-running ok Feb 27 00:09:13 crc kubenswrapper[4781]: healthz check failed Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.367247 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lcg4" podUID="6846d54c-4d22-46c7-b017-947a3986d773" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.453083 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9ngbg"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.454173 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.455591 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.461715 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ngbg"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.642613 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-utilities\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.642744 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mqv2\" (UniqueName: \"kubernetes.io/projected/baa593f3-06c4-461f-a893-609b07dfd282-kube-api-access-9mqv2\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.642811 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-catalog-content\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " 
pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.728406 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.729118 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.730753 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.731741 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.741987 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.774172 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mqv2\" (UniqueName: \"kubernetes.io/projected/baa593f3-06c4-461f-a893-609b07dfd282-kube-api-access-9mqv2\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.774289 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-catalog-content\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.774763 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-catalog-content\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.774798 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-utilities\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.774876 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-utilities\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.807722 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mqv2\" (UniqueName: \"kubernetes.io/projected/baa593f3-06c4-461f-a893-609b07dfd282-kube-api-access-9mqv2\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.831967 4781 ???:1] "http: TLS handshake error from 192.168.126.11:56592: no serving certificate available for the kubelet" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.855975 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5rnj7"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.857031 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.873904 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rnj7"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.875874 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.875938 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.936673 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.937437 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.940030 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.940172 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.944097 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.977170 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-catalog-content\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.977219 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-utilities\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.977246 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.977278 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-w6frr\" (UniqueName: \"kubernetes.io/projected/97e44b43-3c8e-4065-a51b-aa3f27c36712-kube-api-access-w6frr\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.977300 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.977596 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.997337 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.011130 4781 generic.go:334] "Generic (PLEG): container finished" podID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerID="0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce" exitCode=0 Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.011185 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52xgq" 
event={"ID":"0f286d62-2145-4bbb-91eb-28ffda9b2494","Type":"ContainerDied","Data":"0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce"} Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.014114 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" event={"ID":"c3667d98-cf94-4751-8191-1d924ea13617","Type":"ContainerStarted","Data":"2a320ba2160cdf180792174d7de2338c3325b28784d5051920447fb8b1241688"} Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.014209 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" event={"ID":"c3667d98-cf94-4751-8191-1d924ea13617","Type":"ContainerStarted","Data":"3d67897192f1eb6932753a86ce0f7bd6d344c09b54e771c58e76686037dd2268"} Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.014686 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.029999 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.031781 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774845979b-t9755" event={"ID":"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b","Type":"ContainerStarted","Data":"cca3038329bc9717a99aec45d98f04146a5502dd497a4438c9c5be74b15ed23a"} Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.031823 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774845979b-t9755" event={"ID":"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b","Type":"ContainerStarted","Data":"8475af139220f4e889d3615e00c27c5c3f916ced71896c2698d7d1d5d2f40792"} Feb 27 00:09:14 crc 
kubenswrapper[4781]: I0227 00:09:14.032085 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.034943 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" event={"ID":"16339491-baee-42b5-82bb-07bca82a5f77","Type":"ContainerStarted","Data":"c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077"} Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.034976 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" event={"ID":"16339491-baee-42b5-82bb-07bca82a5f77","Type":"ContainerStarted","Data":"baa2ed7e45a407c61fcadf3b6fb1abb2bf58b2f1863ead5f5bd18f0e92393602"} Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.035025 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.039077 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.050730 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" podStartSLOduration=4.050714036 podStartE2EDuration="4.050714036s" podCreationTimestamp="2026-02-27 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:14.048381703 +0000 UTC m=+223.305921257" watchObservedRunningTime="2026-02-27 00:09:14.050714036 +0000 UTC m=+223.308253590" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.076212 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.077986 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.079769 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd741f12-8908-4f25-a2ab-2a9deb826494-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dd741f12-8908-4f25-a2ab-2a9deb826494\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.079887 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-catalog-content\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.079944 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-utilities\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.080051 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6frr\" (UniqueName: \"kubernetes.io/projected/97e44b43-3c8e-4065-a51b-aa3f27c36712-kube-api-access-w6frr\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.080069 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd741f12-8908-4f25-a2ab-2a9deb826494-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dd741f12-8908-4f25-a2ab-2a9deb826494\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.080399 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-catalog-content\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.081508 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-utilities\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.082331 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.093834 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.104364 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6frr\" (UniqueName: \"kubernetes.io/projected/97e44b43-3c8e-4065-a51b-aa3f27c36712-kube-api-access-w6frr\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.127502 4781 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" podStartSLOduration=168.127463361 podStartE2EDuration="2m48.127463361s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:14.101868913 +0000 UTC m=+223.359408487" watchObservedRunningTime="2026-02-27 00:09:14.127463361 +0000 UTC m=+223.385002915" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.157210 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-774845979b-t9755" podStartSLOduration=4.157165554 podStartE2EDuration="4.157165554s" podCreationTimestamp="2026-02-27 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:14.150392398 +0000 UTC m=+223.407931952" watchObservedRunningTime="2026-02-27 00:09:14.157165554 +0000 UTC m=+223.414705108" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.177073 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.183880 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd741f12-8908-4f25-a2ab-2a9deb826494-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dd741f12-8908-4f25-a2ab-2a9deb826494\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.184046 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd741f12-8908-4f25-a2ab-2a9deb826494-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dd741f12-8908-4f25-a2ab-2a9deb826494\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.185090 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd741f12-8908-4f25-a2ab-2a9deb826494-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dd741f12-8908-4f25-a2ab-2a9deb826494\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.209959 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.210012 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.211762 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.226945 4781 patch_prober.go:28] interesting pod/console-f9d7485db-vtsxv container/console 
namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.226999 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-vtsxv" podUID="76705148-274c-4428-9508-13fe1193646e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.231470 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd741f12-8908-4f25-a2ab-2a9deb826494-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dd741f12-8908-4f25-a2ab-2a9deb826494\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.279050 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.372422 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8lcg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 00:09:14 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Feb 27 00:09:14 crc kubenswrapper[4781]: [+]process-running ok Feb 27 00:09:14 crc kubenswrapper[4781]: healthz check failed Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.372829 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lcg4" podUID="6846d54c-4d22-46c7-b017-947a3986d773" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.670061 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hcdz5"] Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.671036 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.677356 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.743244 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hcdz5"] Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.796194 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ngbg"] Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.806457 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztvqm\" (UniqueName: \"kubernetes.io/projected/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-kube-api-access-ztvqm\") pod \"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.806548 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-catalog-content\") pod \"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.806593 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-utilities\") pod \"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: W0227 00:09:14.828567 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaa593f3_06c4_461f_a893_609b07dfd282.slice/crio-9502c5ad99503e1096d0070d626245f0844a912a2ffc6a125931ca6764817da5 WatchSource:0}: Error finding container 9502c5ad99503e1096d0070d626245f0844a912a2ffc6a125931ca6764817da5: Status 404 returned error can't find the container with id 9502c5ad99503e1096d0070d626245f0844a912a2ffc6a125931ca6764817da5 Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.847087 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rnj7"] Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.907379 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztvqm\" (UniqueName: \"kubernetes.io/projected/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-kube-api-access-ztvqm\") pod \"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.907467 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-catalog-content\") pod \"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.907497 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-utilities\") pod \"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.908031 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-utilities\") pod 
\"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.908489 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-catalog-content\") pod \"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.951061 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztvqm\" (UniqueName: \"kubernetes.io/projected/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-kube-api-access-ztvqm\") pod \"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.955127 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.955188 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.955256 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.955294 
4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.003189 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.048204 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rnj7" event={"ID":"97e44b43-3c8e-4065-a51b-aa3f27c36712","Type":"ContainerStarted","Data":"b4c78b3d5964c2a730f268fed158cc29cd746663976e63644c0b8dcc232f4b12"} Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.065708 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ngbg" event={"ID":"baa593f3-06c4-461f-a893-609b07dfd282","Type":"ContainerStarted","Data":"9502c5ad99503e1096d0070d626245f0844a912a2ffc6a125931ca6764817da5"} Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.091522 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dj7h5"] Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.107731 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dj7h5"] Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.107830 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.132362 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.172578 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.216841 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z44vv\" (UniqueName: \"kubernetes.io/projected/514049ae-2568-416f-9705-524c2bf74cbd-kube-api-access-z44vv\") pod \"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.217281 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-catalog-content\") pod \"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.217442 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-utilities\") pod \"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: W0227 00:09:15.235305 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9a90341e_86fb_4819_848b_cdd71b0ac0a7.slice/crio-7bbdbb00ad1bfd6f9360f74b3ba833fdc090e22888962d9a0ed6331a2064890b WatchSource:0}: Error finding container 
7bbdbb00ad1bfd6f9360f74b3ba833fdc090e22888962d9a0ed6331a2064890b: Status 404 returned error can't find the container with id 7bbdbb00ad1bfd6f9360f74b3ba833fdc090e22888962d9a0ed6331a2064890b Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.327549 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-utilities\") pod \"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.327710 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z44vv\" (UniqueName: \"kubernetes.io/projected/514049ae-2568-416f-9705-524c2bf74cbd-kube-api-access-z44vv\") pod \"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.327781 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-catalog-content\") pod \"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.328555 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-catalog-content\") pod \"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.328587 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-utilities\") pod 
\"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.356557 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z44vv\" (UniqueName: \"kubernetes.io/projected/514049ae-2568-416f-9705-524c2bf74cbd-kube-api-access-z44vv\") pod \"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.370035 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8lcg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 00:09:15 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Feb 27 00:09:15 crc kubenswrapper[4781]: [+]process-running ok Feb 27 00:09:15 crc kubenswrapper[4781]: healthz check failed Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.370083 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lcg4" podUID="6846d54c-4d22-46c7-b017-947a3986d773" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.377495 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.536910 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.615310 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hcdz5"] Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.923326 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dj7h5"] Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.082078 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd741f12-8908-4f25-a2ab-2a9deb826494","Type":"ContainerStarted","Data":"c3a493a37405ad435f02cc17eedb2fb9132690911ac52fc757e13532a2d8192b"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.082563 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd741f12-8908-4f25-a2ab-2a9deb826494","Type":"ContainerStarted","Data":"bba3576a0eb52065bd913ed89976d8f6d85c179f2826194a815d95093997aef7"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.088760 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9a90341e-86fb-4819-848b-cdd71b0ac0a7","Type":"ContainerStarted","Data":"4d228af20df66a57dfbd572426eaea07b15759a2c69b3a41b9d87c2e34efb05c"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.088804 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9a90341e-86fb-4819-848b-cdd71b0ac0a7","Type":"ContainerStarted","Data":"7bbdbb00ad1bfd6f9360f74b3ba833fdc090e22888962d9a0ed6331a2064890b"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.093270 4781 generic.go:334] "Generic (PLEG): container finished" podID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerID="b414a361ce30e28fdc5bc47f53f766e6427e2ccb8cfe76be4eed8ce4ee48ebca" exitCode=0 Feb 27 00:09:16 
crc kubenswrapper[4781]: I0227 00:09:16.093345 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rnj7" event={"ID":"97e44b43-3c8e-4065-a51b-aa3f27c36712","Type":"ContainerDied","Data":"b414a361ce30e28fdc5bc47f53f766e6427e2ccb8cfe76be4eed8ce4ee48ebca"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.097538 4781 generic.go:334] "Generic (PLEG): container finished" podID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerID="a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a" exitCode=0 Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.097844 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcdz5" event={"ID":"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0","Type":"ContainerDied","Data":"a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.097876 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcdz5" event={"ID":"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0","Type":"ContainerStarted","Data":"6d980a6fc9de180882f2ee8cc193af0d7ab5d1ba875bfb8da4f55cc14f767f69"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.108165 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.108148982 podStartE2EDuration="3.108148982s" podCreationTimestamp="2026-02-27 00:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:16.100591048 +0000 UTC m=+225.358130602" watchObservedRunningTime="2026-02-27 00:09:16.108148982 +0000 UTC m=+225.365688536" Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.127830 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
podStartSLOduration=3.127810744 podStartE2EDuration="3.127810744s" podCreationTimestamp="2026-02-27 00:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:16.127648551 +0000 UTC m=+225.385188105" watchObservedRunningTime="2026-02-27 00:09:16.127810744 +0000 UTC m=+225.385350298" Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.135846 4781 generic.go:334] "Generic (PLEG): container finished" podID="678f27fc-d210-4a4f-bd73-090378740da9" containerID="898ccef1da25e7c00fcd11040419fe4b505ada16cb26d62d9a4806872cb68348" exitCode=0 Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.135882 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" event={"ID":"678f27fc-d210-4a4f-bd73-090378740da9","Type":"ContainerDied","Data":"898ccef1da25e7c00fcd11040419fe4b505ada16cb26d62d9a4806872cb68348"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.152808 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj7h5" event={"ID":"514049ae-2568-416f-9705-524c2bf74cbd","Type":"ContainerStarted","Data":"45d5d509e8ad0dc50e09ff3936cc7a26189c6c645b18672248f0a72722749ca4"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.162393 4781 generic.go:334] "Generic (PLEG): container finished" podID="baa593f3-06c4-461f-a893-609b07dfd282" containerID="eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91" exitCode=0 Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.162900 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ngbg" event={"ID":"baa593f3-06c4-461f-a893-609b07dfd282","Type":"ContainerDied","Data":"eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.373675 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.377098 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:17 crc kubenswrapper[4781]: I0227 00:09:17.187531 4781 generic.go:334] "Generic (PLEG): container finished" podID="514049ae-2568-416f-9705-524c2bf74cbd" containerID="39f26f7fa9552ef0082d4338be84e32dc690ddb73a7ed4be83f09421026f56c7" exitCode=0 Feb 27 00:09:17 crc kubenswrapper[4781]: I0227 00:09:17.187618 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj7h5" event={"ID":"514049ae-2568-416f-9705-524c2bf74cbd","Type":"ContainerDied","Data":"39f26f7fa9552ef0082d4338be84e32dc690ddb73a7ed4be83f09421026f56c7"} Feb 27 00:09:17 crc kubenswrapper[4781]: I0227 00:09:17.965701 4781 generic.go:334] "Generic (PLEG): container finished" podID="dd741f12-8908-4f25-a2ab-2a9deb826494" containerID="c3a493a37405ad435f02cc17eedb2fb9132690911ac52fc757e13532a2d8192b" exitCode=0 Feb 27 00:09:17 crc kubenswrapper[4781]: I0227 00:09:17.965780 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd741f12-8908-4f25-a2ab-2a9deb826494","Type":"ContainerDied","Data":"c3a493a37405ad435f02cc17eedb2fb9132690911ac52fc757e13532a2d8192b"} Feb 27 00:09:18 crc kubenswrapper[4781]: I0227 00:09:18.277413 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:09:18 crc kubenswrapper[4781]: I0227 00:09:18.278342 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:09:18 crc kubenswrapper[4781]: I0227 00:09:18.278449 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:09:18 crc kubenswrapper[4781]: I0227 00:09:18.278543 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:09:18 crc kubenswrapper[4781]: I0227 00:09:18.290985 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.035644 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.049930 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.050331 4781 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fdkct container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.050368 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" podUID="e1c9b213-8c36-4ecf-831f-69a912f6364f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.062548 4781 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fdkct container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": context deadline exceeded" start-of-body= Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.062853 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" podUID="e1c9b213-8c36-4ecf-831f-69a912f6364f" containerName="openshift-config-operator" probeResult="failure" 
output="Get \"https://10.217.0.13:8443/healthz\": context deadline exceeded" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.156801 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.185842 4781 generic.go:334] "Generic (PLEG): container finished" podID="9a90341e-86fb-4819-848b-cdd71b0ac0a7" containerID="4d228af20df66a57dfbd572426eaea07b15759a2c69b3a41b9d87c2e34efb05c" exitCode=0 Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.217306 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.234794 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.235448 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:09:21 crc kubenswrapper[4781]: E0227 00:09:21.236540 4781 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="3.269s" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.277120 4781 ???:1] "http: TLS handshake error from 192.168.126.11:58336: no serving certificate available for the kubelet" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.357660 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.357711 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9a90341e-86fb-4819-848b-cdd71b0ac0a7","Type":"ContainerDied","Data":"4d228af20df66a57dfbd572426eaea07b15759a2c69b3a41b9d87c2e34efb05c"} Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.627791 4781 ???:1] "http: TLS handshake error from 192.168.126.11:58342: no serving certificate available for the kubelet" Feb 27 00:09:24 crc kubenswrapper[4781]: I0227 00:09:24.291105 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:24 crc kubenswrapper[4781]: I0227 00:09:24.296107 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:24 crc kubenswrapper[4781]: I0227 00:09:24.955051 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:24 crc kubenswrapper[4781]: I0227 00:09:24.955130 4781 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:24 crc kubenswrapper[4781]: I0227 00:09:24.955227 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:24 crc kubenswrapper[4781]: I0227 00:09:24.955252 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:29 crc kubenswrapper[4781]: I0227 00:09:29.705925 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-774845979b-t9755"] Feb 27 00:09:29 crc kubenswrapper[4781]: I0227 00:09:29.706770 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-774845979b-t9755" podUID="7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" containerName="controller-manager" containerID="cri-o://cca3038329bc9717a99aec45d98f04146a5502dd497a4438c9c5be74b15ed23a" gracePeriod=30 Feb 27 00:09:29 crc kubenswrapper[4781]: I0227 00:09:29.726279 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf"] Feb 27 00:09:29 crc kubenswrapper[4781]: I0227 00:09:29.726653 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" podUID="c3667d98-cf94-4751-8191-1d924ea13617" 
containerName="route-controller-manager" containerID="cri-o://2a320ba2160cdf180792174d7de2338c3325b28784d5051920447fb8b1241688" gracePeriod=30 Feb 27 00:09:31 crc kubenswrapper[4781]: I0227 00:09:31.561675 4781 ???:1] "http: TLS handshake error from 192.168.126.11:45998: no serving certificate available for the kubelet" Feb 27 00:09:32 crc kubenswrapper[4781]: I0227 00:09:32.789830 4781 patch_prober.go:28] interesting pod/controller-manager-774845979b-t9755 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Feb 27 00:09:32 crc kubenswrapper[4781]: I0227 00:09:32.789946 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-774845979b-t9755" podUID="7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Feb 27 00:09:32 crc kubenswrapper[4781]: I0227 00:09:32.816876 4781 patch_prober.go:28] interesting pod/route-controller-manager-8d8c487b-4kknf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Feb 27 00:09:32 crc kubenswrapper[4781]: I0227 00:09:32.816925 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" podUID="c3667d98-cf94-4751-8191-1d924ea13617" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Feb 27 00:09:32 crc kubenswrapper[4781]: I0227 00:09:32.861723 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:33 crc kubenswrapper[4781]: I0227 00:09:33.280923 4781 generic.go:334] "Generic (PLEG): container finished" podID="7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" containerID="cca3038329bc9717a99aec45d98f04146a5502dd497a4438c9c5be74b15ed23a" exitCode=0 Feb 27 00:09:33 crc kubenswrapper[4781]: I0227 00:09:33.281009 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774845979b-t9755" event={"ID":"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b","Type":"ContainerDied","Data":"cca3038329bc9717a99aec45d98f04146a5502dd497a4438c9c5be74b15ed23a"} Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.211916 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.214714 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.232365 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.436643 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.445188 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.956130 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.956173 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.956237 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.956253 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.956322 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-qjwrj" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.957257 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"a7c8f7063bafc919361346a9dc1315b92859e40920e9b450653ad64294dfaf96"} pod="openshift-console/downloads-7954f5f757-qjwrj" 
containerMessage="Container download-server failed liveness probe, will be restarted" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.957244 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.957329 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" containerID="cri-o://a7c8f7063bafc919361346a9dc1315b92859e40920e9b450653ad64294dfaf96" gracePeriod=2 Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.957678 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:39 crc kubenswrapper[4781]: I0227 00:09:39.318289 4781 generic.go:334] "Generic (PLEG): container finished" podID="c3667d98-cf94-4751-8191-1d924ea13617" containerID="2a320ba2160cdf180792174d7de2338c3325b28784d5051920447fb8b1241688" exitCode=0 Feb 27 00:09:39 crc kubenswrapper[4781]: I0227 00:09:39.318360 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" event={"ID":"c3667d98-cf94-4751-8191-1d924ea13617","Type":"ContainerDied","Data":"2a320ba2160cdf180792174d7de2338c3325b28784d5051920447fb8b1241688"} Feb 27 00:09:40 crc kubenswrapper[4781]: I0227 00:09:40.326283 4781 generic.go:334] "Generic (PLEG): container finished" podID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" 
containerID="a7c8f7063bafc919361346a9dc1315b92859e40920e9b450653ad64294dfaf96" exitCode=0 Feb 27 00:09:40 crc kubenswrapper[4781]: I0227 00:09:40.326342 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qjwrj" event={"ID":"a75bfacf-8cf7-4560-8b4a-6e876daa4c8c","Type":"ContainerDied","Data":"a7c8f7063bafc919361346a9dc1315b92859e40920e9b450653ad64294dfaf96"} Feb 27 00:09:42 crc kubenswrapper[4781]: I0227 00:09:42.341892 4781 generic.go:334] "Generic (PLEG): container finished" podID="91e2c481-01ee-461f-bc5b-d09b7ea221c5" containerID="34034ef1e924a05fbc92daf60e2f0c105f332a30b0fe9cea72b0da3d3065e13e" exitCode=0 Feb 27 00:09:42 crc kubenswrapper[4781]: I0227 00:09:42.342091 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29535840-t9tlz" event={"ID":"91e2c481-01ee-461f-bc5b-d09b7ea221c5","Type":"ContainerDied","Data":"34034ef1e924a05fbc92daf60e2f0c105f332a30b0fe9cea72b0da3d3065e13e"} Feb 27 00:09:42 crc kubenswrapper[4781]: I0227 00:09:42.789579 4781 patch_prober.go:28] interesting pod/controller-manager-774845979b-t9755 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Feb 27 00:09:42 crc kubenswrapper[4781]: I0227 00:09:42.789670 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-774845979b-t9755" podUID="7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Feb 27 00:09:42 crc kubenswrapper[4781]: I0227 00:09:42.817184 4781 patch_prober.go:28] interesting pod/route-controller-manager-8d8c487b-4kknf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Feb 27 00:09:42 crc kubenswrapper[4781]: I0227 00:09:42.817250 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" podUID="c3667d98-cf94-4751-8191-1d924ea13617" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Feb 27 00:09:42 crc kubenswrapper[4781]: I0227 00:09:42.895587 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:09:42 crc kubenswrapper[4781]: I0227 00:09:42.895661 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.669214 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.676312 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.717450 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.722035 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.776952 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/678f27fc-d210-4a4f-bd73-090378740da9-secret-volume\") pod \"678f27fc-d210-4a4f-bd73-090378740da9\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.777083 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kube-api-access\") pod \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\" (UID: \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.777116 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/678f27fc-d210-4a4f-bd73-090378740da9-config-volume\") pod \"678f27fc-d210-4a4f-bd73-090378740da9\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.777150 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kubelet-dir\") pod \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\" (UID: \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.777179 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/91e2c481-01ee-461f-bc5b-d09b7ea221c5-serviceca\") pod 
\"91e2c481-01ee-461f-bc5b-d09b7ea221c5\" (UID: \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.777227 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-856kl\" (UniqueName: \"kubernetes.io/projected/91e2c481-01ee-461f-bc5b-d09b7ea221c5-kube-api-access-856kl\") pod \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\" (UID: \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.777266 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95hgc\" (UniqueName: \"kubernetes.io/projected/678f27fc-d210-4a4f-bd73-090378740da9-kube-api-access-95hgc\") pod \"678f27fc-d210-4a4f-bd73-090378740da9\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.777786 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9a90341e-86fb-4819-848b-cdd71b0ac0a7" (UID: "9a90341e-86fb-4819-848b-cdd71b0ac0a7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.780097 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/678f27fc-d210-4a4f-bd73-090378740da9-config-volume" (OuterVolumeSpecName: "config-volume") pod "678f27fc-d210-4a4f-bd73-090378740da9" (UID: "678f27fc-d210-4a4f-bd73-090378740da9"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.780514 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91e2c481-01ee-461f-bc5b-d09b7ea221c5-serviceca" (OuterVolumeSpecName: "serviceca") pod "91e2c481-01ee-461f-bc5b-d09b7ea221c5" (UID: "91e2c481-01ee-461f-bc5b-d09b7ea221c5"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.787244 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/678f27fc-d210-4a4f-bd73-090378740da9-kube-api-access-95hgc" (OuterVolumeSpecName: "kube-api-access-95hgc") pod "678f27fc-d210-4a4f-bd73-090378740da9" (UID: "678f27fc-d210-4a4f-bd73-090378740da9"). InnerVolumeSpecName "kube-api-access-95hgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.787358 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/678f27fc-d210-4a4f-bd73-090378740da9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "678f27fc-d210-4a4f-bd73-090378740da9" (UID: "678f27fc-d210-4a4f-bd73-090378740da9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.789181 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91e2c481-01ee-461f-bc5b-d09b7ea221c5-kube-api-access-856kl" (OuterVolumeSpecName: "kube-api-access-856kl") pod "91e2c481-01ee-461f-bc5b-d09b7ea221c5" (UID: "91e2c481-01ee-461f-bc5b-d09b7ea221c5"). InnerVolumeSpecName "kube-api-access-856kl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.799949 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9a90341e-86fb-4819-848b-cdd71b0ac0a7" (UID: "9a90341e-86fb-4819-848b-cdd71b0ac0a7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.879808 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd741f12-8908-4f25-a2ab-2a9deb826494-kube-api-access\") pod \"dd741f12-8908-4f25-a2ab-2a9deb826494\" (UID: \"dd741f12-8908-4f25-a2ab-2a9deb826494\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.879898 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd741f12-8908-4f25-a2ab-2a9deb826494-kubelet-dir\") pod \"dd741f12-8908-4f25-a2ab-2a9deb826494\" (UID: \"dd741f12-8908-4f25-a2ab-2a9deb826494\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.880143 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/678f27fc-d210-4a4f-bd73-090378740da9-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.880154 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.880165 4781 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kubelet-dir\") on node \"crc\" DevicePath 
\"\"" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.880173 4781 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/91e2c481-01ee-461f-bc5b-d09b7ea221c5-serviceca\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.880181 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-856kl\" (UniqueName: \"kubernetes.io/projected/91e2c481-01ee-461f-bc5b-d09b7ea221c5-kube-api-access-856kl\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.880191 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95hgc\" (UniqueName: \"kubernetes.io/projected/678f27fc-d210-4a4f-bd73-090378740da9-kube-api-access-95hgc\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.880199 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/678f27fc-d210-4a4f-bd73-090378740da9-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.880238 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd741f12-8908-4f25-a2ab-2a9deb826494-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dd741f12-8908-4f25-a2ab-2a9deb826494" (UID: "dd741f12-8908-4f25-a2ab-2a9deb826494"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.884248 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd741f12-8908-4f25-a2ab-2a9deb826494-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dd741f12-8908-4f25-a2ab-2a9deb826494" (UID: "dd741f12-8908-4f25-a2ab-2a9deb826494"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.981330 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd741f12-8908-4f25-a2ab-2a9deb826494-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.981360 4781 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd741f12-8908-4f25-a2ab-2a9deb826494-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.357904 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" event={"ID":"678f27fc-d210-4a4f-bd73-090378740da9","Type":"ContainerDied","Data":"8e97fd8fcdef99a06975af07b11d983d49d1856c8a620f0853e184ef575d88e1"} Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.358296 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e97fd8fcdef99a06975af07b11d983d49d1856c8a620f0853e184ef575d88e1" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.357958 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.362148 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.362152 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd741f12-8908-4f25-a2ab-2a9deb826494","Type":"ContainerDied","Data":"bba3576a0eb52065bd913ed89976d8f6d85c179f2826194a815d95093997aef7"} Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.362238 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bba3576a0eb52065bd913ed89976d8f6d85c179f2826194a815d95093997aef7" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.364197 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9a90341e-86fb-4819-848b-cdd71b0ac0a7","Type":"ContainerDied","Data":"7bbdbb00ad1bfd6f9360f74b3ba833fdc090e22888962d9a0ed6331a2064890b"} Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.364224 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bbdbb00ad1bfd6f9360f74b3ba833fdc090e22888962d9a0ed6331a2064890b" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.364264 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.365612 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29535840-t9tlz" event={"ID":"91e2c481-01ee-461f-bc5b-d09b7ea221c5","Type":"ContainerDied","Data":"02350f41c01977124604e142f885201d5743582263439e32be7f03871d0f9773"} Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.365665 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02350f41c01977124604e142f885201d5743582263439e32be7f03871d0f9773" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.365725 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:44 crc kubenswrapper[4781]: E0227 00:09:44.919772 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 00:09:44 crc kubenswrapper[4781]: E0227 00:09:44.920336 4781 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 00:09:44 crc kubenswrapper[4781]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 00:09:44 crc kubenswrapper[4781]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mv9hp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29535848-ccctv_openshift-infra(df035290-8e3c-422b-90ac-573b592defcf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Feb 27 00:09:44 crc kubenswrapper[4781]: > logger="UnhandledError" Feb 27 00:09:44 crc kubenswrapper[4781]: E0227 00:09:44.922438 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29535848-ccctv" podUID="df035290-8e3c-422b-90ac-573b592defcf" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.955102 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.955179 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.189569 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6"
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.327737 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774845979b-t9755"
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.334328 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf"
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.374908 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf"
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.374939 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" event={"ID":"c3667d98-cf94-4751-8191-1d924ea13617","Type":"ContainerDied","Data":"3d67897192f1eb6932753a86ce0f7bd6d344c09b54e771c58e76686037dd2268"}
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.375023 4781 scope.go:117] "RemoveContainer" containerID="2a320ba2160cdf180792174d7de2338c3325b28784d5051920447fb8b1241688"
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.378200 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774845979b-t9755"
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.378374 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774845979b-t9755" event={"ID":"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b","Type":"ContainerDied","Data":"8475af139220f4e889d3615e00c27c5c3f916ced71896c2698d7d1d5d2f40792"}
Feb 27 00:09:45 crc kubenswrapper[4781]: E0227 00:09:45.379002 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29535848-ccctv" podUID="df035290-8e3c-422b-90ac-573b592defcf"
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407495 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-client-ca\") pod \"c3667d98-cf94-4751-8191-1d924ea13617\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") "
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407553 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-serving-cert\") pod \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") "
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407588 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3667d98-cf94-4751-8191-1d924ea13617-serving-cert\") pod \"c3667d98-cf94-4751-8191-1d924ea13617\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") "
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407718 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-config\") pod \"c3667d98-cf94-4751-8191-1d924ea13617\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") "
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407845 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qk2n\" (UniqueName: \"kubernetes.io/projected/c3667d98-cf94-4751-8191-1d924ea13617-kube-api-access-4qk2n\") pod \"c3667d98-cf94-4751-8191-1d924ea13617\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") "
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407881 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-proxy-ca-bundles\") pod \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") "
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407916 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjsgw\" (UniqueName: \"kubernetes.io/projected/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-kube-api-access-xjsgw\") pod \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") "
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407940 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-config\") pod \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") "
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407964 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-client-ca\") pod \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") "
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.408299 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-client-ca" (OuterVolumeSpecName: "client-ca") pod "c3667d98-cf94-4751-8191-1d924ea13617" (UID: "c3667d98-cf94-4751-8191-1d924ea13617"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.409050 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-config" (OuterVolumeSpecName: "config") pod "c3667d98-cf94-4751-8191-1d924ea13617" (UID: "c3667d98-cf94-4751-8191-1d924ea13617"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.409067 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" (UID: "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.409138 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-client-ca" (OuterVolumeSpecName: "client-ca") pod "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" (UID: "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.409153 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-config" (OuterVolumeSpecName: "config") pod "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" (UID: "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.414824 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3667d98-cf94-4751-8191-1d924ea13617-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c3667d98-cf94-4751-8191-1d924ea13617" (UID: "c3667d98-cf94-4751-8191-1d924ea13617"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.414851 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-kube-api-access-xjsgw" (OuterVolumeSpecName: "kube-api-access-xjsgw") pod "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" (UID: "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b"). InnerVolumeSpecName "kube-api-access-xjsgw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.417264 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3667d98-cf94-4751-8191-1d924ea13617-kube-api-access-4qk2n" (OuterVolumeSpecName: "kube-api-access-4qk2n") pod "c3667d98-cf94-4751-8191-1d924ea13617" (UID: "c3667d98-cf94-4751-8191-1d924ea13617"). InnerVolumeSpecName "kube-api-access-4qk2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.418798 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" (UID: "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509330 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-client-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509364 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509374 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3667d98-cf94-4751-8191-1d924ea13617-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509386 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509396 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qk2n\" (UniqueName: \"kubernetes.io/projected/c3667d98-cf94-4751-8191-1d924ea13617-kube-api-access-4qk2n\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509407 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509418 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjsgw\" (UniqueName: \"kubernetes.io/projected/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-kube-api-access-xjsgw\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509427 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509434 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-client-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.702209 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf"]
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.706526 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf"]
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.712284 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-774845979b-t9755"]
Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.715494 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-774845979b-t9755"]
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.089410 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"]
Feb 27 00:09:46 crc kubenswrapper[4781]: E0227 00:09:46.089751 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a90341e-86fb-4819-848b-cdd71b0ac0a7" containerName="pruner"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.089764 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a90341e-86fb-4819-848b-cdd71b0ac0a7" containerName="pruner"
Feb 27 00:09:46 crc kubenswrapper[4781]: E0227 00:09:46.089773 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" containerName="controller-manager"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.089780 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" containerName="controller-manager"
Feb 27 00:09:46 crc kubenswrapper[4781]: E0227 00:09:46.089792 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd741f12-8908-4f25-a2ab-2a9deb826494" containerName="pruner"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.089798 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd741f12-8908-4f25-a2ab-2a9deb826494" containerName="pruner"
Feb 27 00:09:46 crc kubenswrapper[4781]: E0227 00:09:46.089807 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3667d98-cf94-4751-8191-1d924ea13617" containerName="route-controller-manager"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.089812 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3667d98-cf94-4751-8191-1d924ea13617" containerName="route-controller-manager"
Feb 27 00:09:46 crc kubenswrapper[4781]: E0227 00:09:46.089823 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e2c481-01ee-461f-bc5b-d09b7ea221c5" containerName="image-pruner"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.089828 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e2c481-01ee-461f-bc5b-d09b7ea221c5" containerName="image-pruner"
Feb 27 00:09:46 crc kubenswrapper[4781]: E0227 00:09:46.089842 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="678f27fc-d210-4a4f-bd73-090378740da9" containerName="collect-profiles"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.089847 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="678f27fc-d210-4a4f-bd73-090378740da9" containerName="collect-profiles"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.090037 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="678f27fc-d210-4a4f-bd73-090378740da9" containerName="collect-profiles"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.090049 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" containerName="controller-manager"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.090062 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd741f12-8908-4f25-a2ab-2a9deb826494" containerName="pruner"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.090070 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a90341e-86fb-4819-848b-cdd71b0ac0a7" containerName="pruner"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.090078 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e2c481-01ee-461f-bc5b-d09b7ea221c5" containerName="image-pruner"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.090092 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3667d98-cf94-4751-8191-1d924ea13617" containerName="route-controller-manager"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.090790 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.093459 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.094074 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.094154 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.094151 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.094318 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.094106 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.114707 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5cf657794c-phnhf"]
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.116019 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.121789 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cf657794c-phnhf"]
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.123162 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.123271 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.123316 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.123345 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.123422 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.125278 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"]
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.129385 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.131367 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220276 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-client-ca\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220353 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vtpb\" (UniqueName: \"kubernetes.io/projected/16dfdff5-f774-4b57-adcf-587eb1a87012-kube-api-access-2vtpb\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220437 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-config\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220457 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-config\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220481 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpsw7\" (UniqueName: \"kubernetes.io/projected/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-kube-api-access-qpsw7\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220502 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dfdff5-f774-4b57-adcf-587eb1a87012-serving-cert\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220520 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-proxy-ca-bundles\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220578 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-client-ca\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220595 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-serving-cert\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321513 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-client-ca\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321553 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-serving-cert\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321585 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-client-ca\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321642 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vtpb\" (UniqueName: \"kubernetes.io/projected/16dfdff5-f774-4b57-adcf-587eb1a87012-kube-api-access-2vtpb\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321677 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-config\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321694 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-config\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321712 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpsw7\" (UniqueName: \"kubernetes.io/projected/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-kube-api-access-qpsw7\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321733 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dfdff5-f774-4b57-adcf-587eb1a87012-serving-cert\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321750 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-proxy-ca-bundles\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.322836 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-proxy-ca-bundles\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.323363 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-client-ca\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.323730 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-config\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.324326 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-config\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.325915 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-client-ca\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.328659 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-serving-cert\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.329084 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dfdff5-f774-4b57-adcf-587eb1a87012-serving-cert\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.339073 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vtpb\" (UniqueName: \"kubernetes.io/projected/16dfdff5-f774-4b57-adcf-587eb1a87012-kube-api-access-2vtpb\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.343112 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpsw7\" (UniqueName: \"kubernetes.io/projected/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-kube-api-access-qpsw7\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.416023 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.435621 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf"
Feb 27 00:09:47 crc kubenswrapper[4781]: I0227 00:09:47.316948 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" path="/var/lib/kubelet/pods/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b/volumes"
Feb 27 00:09:47 crc kubenswrapper[4781]: I0227 00:09:47.317765 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3667d98-cf94-4751-8191-1d924ea13617" path="/var/lib/kubelet/pods/c3667d98-cf94-4751-8191-1d924ea13617/volumes"
Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.265109 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.266265 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.268898 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.268948 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.274055 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.351214 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180f65d9-1cb5-411b-a031-6f97c06811d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"180f65d9-1cb5-411b-a031-6f97c06811d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.351345 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180f65d9-1cb5-411b-a031-6f97c06811d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"180f65d9-1cb5-411b-a031-6f97c06811d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.453071 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180f65d9-1cb5-411b-a031-6f97c06811d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"180f65d9-1cb5-411b-a031-6f97c06811d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.453180 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180f65d9-1cb5-411b-a031-6f97c06811d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"180f65d9-1cb5-411b-a031-6f97c06811d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.453262 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180f65d9-1cb5-411b-a031-6f97c06811d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"180f65d9-1cb5-411b-a031-6f97c06811d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.470644 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180f65d9-1cb5-411b-a031-6f97c06811d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"180f65d9-1cb5-411b-a031-6f97c06811d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.592899 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 27 00:09:49 crc kubenswrapper[4781]: I0227 00:09:49.654731 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5cf657794c-phnhf"]
Feb 27 00:09:49 crc kubenswrapper[4781]: I0227 00:09:49.754585 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"]
Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.656372 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.657287 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.672313 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.710911 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.710962 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c8795e9-9244-4cc4-a297-3aec68bf3588-kube-api-access\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.711004 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-var-lock\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.813139 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c8795e9-9244-4cc4-a297-3aec68bf3588-kube-api-access\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.813213 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-var-lock\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.813295 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.813360 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.813411 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-var-lock\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") "
pod="openshift-kube-apiserver/installer-9-crc" Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.852390 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c8795e9-9244-4cc4-a297-3aec68bf3588-kube-api-access\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.981949 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 27 00:09:54 crc kubenswrapper[4781]: I0227 00:09:54.955255 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:54 crc kubenswrapper[4781]: I0227 00:09:54.956760 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.363710 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.363867 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zpnxw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kztqg_openshift-marketplace(2b050e9e-d6c8-4e27-ad3f-9681553c1539): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.365080 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kztqg" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" Feb 27 00:09:55 crc 
kubenswrapper[4781]: E0227 00:09:55.379268 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.379405 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f8558,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-52xgq_openshift-marketplace(0f286d62-2145-4bbb-91eb-28ffda9b2494): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.380597 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-52xgq" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.382817 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.383017 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9mqv2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-9ngbg_openshift-marketplace(baa593f3-06c4-461f-a893-609b07dfd282): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.384324 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-9ngbg" podUID="baa593f3-06c4-461f-a893-609b07dfd282" Feb 27 00:09:55 crc 
kubenswrapper[4781]: E0227 00:09:55.413133 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.413301 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ztvqm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-hcdz5_openshift-marketplace(a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.414671 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-hcdz5" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" Feb 27 00:09:56 crc kubenswrapper[4781]: I0227 00:09:56.875774 4781 scope.go:117] "RemoveContainer" containerID="cca3038329bc9717a99aec45d98f04146a5502dd497a4438c9c5be74b15ed23a" Feb 27 00:09:56 crc kubenswrapper[4781]: W0227 00:09:56.882736 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-3491d7e1df67d43284793d5f6326675b7d0fa2567f1c03fbea4368ba2185e97e WatchSource:0}: Error finding container 3491d7e1df67d43284793d5f6326675b7d0fa2567f1c03fbea4368ba2185e97e: Status 404 returned error can't find the container with id 3491d7e1df67d43284793d5f6326675b7d0fa2567f1c03fbea4368ba2185e97e Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.899047 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-hcdz5" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.899047 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kztqg" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.899072 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-9ngbg" podUID="baa593f3-06c4-461f-a893-609b07dfd282" Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.899118 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-52xgq" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" Feb 27 00:09:56 crc kubenswrapper[4781]: W0227 00:09:56.925684 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-7a34eb1a5ac66ea850a359793ec1389d7e0143f96eeca72531294b980e93a4ab WatchSource:0}: Error finding container 7a34eb1a5ac66ea850a359793ec1389d7e0143f96eeca72531294b980e93a4ab: Status 404 returned error can't find the container with id 7a34eb1a5ac66ea850a359793ec1389d7e0143f96eeca72531294b980e93a4ab Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.961611 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.961798 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xgndh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-42hbx_openshift-marketplace(19ed5401-2778-4266-8bf1-1c7244dac100): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.962954 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-42hbx" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.980712 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.980883 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8tk7f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-kqrgb_openshift-marketplace(ac30245d-7e42-440c-99a0-60e2ae15cb8b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.985239 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-kqrgb" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.074279 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kpnjj"] Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.373239 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 27 00:09:57 crc kubenswrapper[4781]: W0227 00:09:57.387399 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7c8795e9_9244_4cc4_a297_3aec68bf3588.slice/crio-f39f3390eb5e42a403a575333d110cbe5ece9b7617819b5c4f74d934848ba9f2 WatchSource:0}: Error finding container f39f3390eb5e42a403a575333d110cbe5ece9b7617819b5c4f74d934848ba9f2: Status 404 returned error can't find the container with id f39f3390eb5e42a403a575333d110cbe5ece9b7617819b5c4f74d934848ba9f2 Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.447487 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.452722 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-7954f5f757-qjwrj" event={"ID":"a75bfacf-8cf7-4560-8b4a-6e876daa4c8c","Type":"ContainerStarted","Data":"4aa0dd804d65361572e088d045b1114ad255cfdb61d9c66d8a70600e9afb7537"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.458317 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qjwrj" Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.458409 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.458446 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.460555 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"] Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.470149 4781 generic.go:334] "Generic (PLEG): container finished" podID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerID="9c12fe8df9037297d8af4eedba0d4e04fa1c5be02d943f1b25318346033b7fc9" exitCode=0 Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.470391 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rnj7" event={"ID":"97e44b43-3c8e-4065-a51b-aa3f27c36712","Type":"ContainerDied","Data":"9c12fe8df9037297d8af4eedba0d4e04fa1c5be02d943f1b25318346033b7fc9"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.473506 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8e08f49816417f62f0f1608baa02644560149e4178f81c7d3d13162bc75dabde"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.473700 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7a34eb1a5ac66ea850a359793ec1389d7e0143f96eeca72531294b980e93a4ab"} Feb 27 00:09:57 crc kubenswrapper[4781]: W0227 00:09:57.476680 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52bc89a7_b3b5_4ab5_ad64_4df7cd38b1b9.slice/crio-afebd9373c32a2481ceafebd85a9c76dfa968abcad4e05173bfa83af78d85aa8 WatchSource:0}: Error finding container afebd9373c32a2481ceafebd85a9c76dfa968abcad4e05173bfa83af78d85aa8: Status 404 returned error can't find the container with id afebd9373c32a2481ceafebd85a9c76dfa968abcad4e05173bfa83af78d85aa8 Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.490741 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj7h5" event={"ID":"514049ae-2568-416f-9705-524c2bf74cbd","Type":"ContainerStarted","Data":"1701a618ae78c1968b5098401e32f2c349b0a0fd1ab9fdf4f23fd86a66112646"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.498223 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c8a6f0698f05db56db7f57e3f2bb2c8c9abc78ba0074a72d0395de435ebed130"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.498250 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3491d7e1df67d43284793d5f6326675b7d0fa2567f1c03fbea4368ba2185e97e"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.499762 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7c8795e9-9244-4cc4-a297-3aec68bf3588","Type":"ContainerStarted","Data":"f39f3390eb5e42a403a575333d110cbe5ece9b7617819b5c4f74d934848ba9f2"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.513211 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" event={"ID":"e866e388-01ab-407a-a59b-d0ba6c3f6f22","Type":"ContainerStarted","Data":"38078e4c5db1c573fe3af278cc88f79c3ee0f65a4ef482a0939cf7fb8c97e8cf"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.518311 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3292f88e2a346227223c8b7e045f4d492bf2fe48e5000128f262f9fb3fa3d4a3"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.518360 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"77219d655cb0cae9e03efcdbb81ebe6929aa8a097422f0b82111b60c7455dd94"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.520194 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5cf657794c-phnhf"] Feb 27 00:09:57 crc kubenswrapper[4781]: E0227 00:09:57.545755 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-42hbx" 
podUID="19ed5401-2778-4266-8bf1-1c7244dac100" Feb 27 00:09:57 crc kubenswrapper[4781]: E0227 00:09:57.546680 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-kqrgb" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.550390 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7c8795e9-9244-4cc4-a297-3aec68bf3588","Type":"ContainerStarted","Data":"2c434d493ffe4d5672fb6269468215eb15ce1d96ef38aac19ec03432d5d7c9b5"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.552978 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" event={"ID":"e866e388-01ab-407a-a59b-d0ba6c3f6f22","Type":"ContainerStarted","Data":"e8ae28edd5d5f135e7b1722851782a3a84c982ded51f1807027fdcab0564456f"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.553021 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" event={"ID":"e866e388-01ab-407a-a59b-d0ba6c3f6f22","Type":"ContainerStarted","Data":"ac3241ff1bfedd5f3c256c91fa7e49e33549d884b07d6bd12e0e7e945b3f26ef"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.554144 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"180f65d9-1cb5-411b-a031-6f97c06811d1","Type":"ContainerStarted","Data":"f593c27ca30b9776970bc50285b68eeb2a08fa45251c439f46326b20d1ecb4cb"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.554173 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"180f65d9-1cb5-411b-a031-6f97c06811d1","Type":"ContainerStarted","Data":"3684593fa27b2e9ecf581ed7146b0988ecfa46c54636096155683cb0b5d113f6"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.556134 4781 generic.go:334] "Generic (PLEG): container finished" podID="514049ae-2568-416f-9705-524c2bf74cbd" containerID="1701a618ae78c1968b5098401e32f2c349b0a0fd1ab9fdf4f23fd86a66112646" exitCode=0 Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.556156 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj7h5" event={"ID":"514049ae-2568-416f-9705-524c2bf74cbd","Type":"ContainerDied","Data":"1701a618ae78c1968b5098401e32f2c349b0a0fd1ab9fdf4f23fd86a66112646"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.562203 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" event={"ID":"16dfdff5-f774-4b57-adcf-587eb1a87012","Type":"ContainerStarted","Data":"6afdb00675a8fb6a3b5a8c7988539248e23cd32e95fcaa349611040a5a9b0dd3"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.562244 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" event={"ID":"16dfdff5-f774-4b57-adcf-587eb1a87012","Type":"ContainerStarted","Data":"38ad01967f3090afd63325bd381f932e92e8200a882b87d6b61db57aba747513"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.562312 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" podUID="16dfdff5-f774-4b57-adcf-587eb1a87012" containerName="controller-manager" containerID="cri-o://6afdb00675a8fb6a3b5a8c7988539248e23cd32e95fcaa349611040a5a9b0dd3" gracePeriod=30 Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.562420 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.564937 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" event={"ID":"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9","Type":"ContainerStarted","Data":"9b2b3a0ab38d37a4eed10a4aaac7da312e7afd3ddf70467638573f0cec7e77bf"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.564984 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" event={"ID":"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9","Type":"ContainerStarted","Data":"afebd9373c32a2481ceafebd85a9c76dfa968abcad4e05173bfa83af78d85aa8"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.565596 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.565595 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" podUID="52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" containerName="route-controller-manager" containerID="cri-o://9b2b3a0ab38d37a4eed10a4aaac7da312e7afd3ddf70467638573f0cec7e77bf" gracePeriod=30 Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.565692 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.571776 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.599907 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=6.599893024 podStartE2EDuration="6.599893024s" podCreationTimestamp="2026-02-27 00:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:58.598736806 +0000 UTC m=+267.856276360" watchObservedRunningTime="2026-02-27 00:09:58.599893024 +0000 UTC m=+267.857432578" Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.619560 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" podStartSLOduration=29.619543623 podStartE2EDuration="29.619543623s" podCreationTimestamp="2026-02-27 00:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:58.61859596 +0000 UTC m=+267.876135514" watchObservedRunningTime="2026-02-27 00:09:58.619543623 +0000 UTC m=+267.877083167" Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.649358 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" podStartSLOduration=29.649343895 podStartE2EDuration="29.649343895s" podCreationTimestamp="2026-02-27 00:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:58.64673393 +0000 UTC m=+267.904273504" watchObservedRunningTime="2026-02-27 00:09:58.649343895 +0000 UTC m=+267.906883449" Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.699677 4781 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=10.699653247 podStartE2EDuration="10.699653247s" podCreationTimestamp="2026-02-27 00:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:58.697842812 +0000 UTC m=+267.955382386" watchObservedRunningTime="2026-02-27 00:09:58.699653247 +0000 UTC m=+267.957192811" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.571096 4781 generic.go:334] "Generic (PLEG): container finished" podID="16dfdff5-f774-4b57-adcf-587eb1a87012" containerID="6afdb00675a8fb6a3b5a8c7988539248e23cd32e95fcaa349611040a5a9b0dd3" exitCode=0 Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.571173 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" event={"ID":"16dfdff5-f774-4b57-adcf-587eb1a87012","Type":"ContainerDied","Data":"6afdb00675a8fb6a3b5a8c7988539248e23cd32e95fcaa349611040a5a9b0dd3"} Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.573154 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-8577b6d867-bbk7d_52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9/route-controller-manager/0.log" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.573188 4781 generic.go:334] "Generic (PLEG): container finished" podID="52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" containerID="9b2b3a0ab38d37a4eed10a4aaac7da312e7afd3ddf70467638573f0cec7e77bf" exitCode=255 Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.573237 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" event={"ID":"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9","Type":"ContainerDied","Data":"9b2b3a0ab38d37a4eed10a4aaac7da312e7afd3ddf70467638573f0cec7e77bf"} Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.574363 
4781 generic.go:334] "Generic (PLEG): container finished" podID="180f65d9-1cb5-411b-a031-6f97c06811d1" containerID="f593c27ca30b9776970bc50285b68eeb2a08fa45251c439f46326b20d1ecb4cb" exitCode=0 Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.574444 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"180f65d9-1cb5-411b-a031-6f97c06811d1","Type":"ContainerDied","Data":"f593c27ca30b9776970bc50285b68eeb2a08fa45251c439f46326b20d1ecb4cb"} Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.575281 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.575326 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.587357 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kpnjj" podStartSLOduration=213.587341259 podStartE2EDuration="3m33.587341259s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:59.586422356 +0000 UTC m=+268.843961910" watchObservedRunningTime="2026-02-27 00:09:59.587341259 +0000 UTC m=+268.844880813" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.658500 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.681592 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f9656d97f-jxdvc"] Feb 27 00:09:59 crc kubenswrapper[4781]: E0227 00:09:59.681821 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16dfdff5-f774-4b57-adcf-587eb1a87012" containerName="controller-manager" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.681837 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="16dfdff5-f774-4b57-adcf-587eb1a87012" containerName="controller-manager" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.681933 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="16dfdff5-f774-4b57-adcf-587eb1a87012" containerName="controller-manager" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.682298 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.698384 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f9656d97f-jxdvc"] Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.725654 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dfdff5-f774-4b57-adcf-587eb1a87012-serving-cert\") pod \"16dfdff5-f774-4b57-adcf-587eb1a87012\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.725747 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-config\") pod \"16dfdff5-f774-4b57-adcf-587eb1a87012\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.725780 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vtpb\" (UniqueName: \"kubernetes.io/projected/16dfdff5-f774-4b57-adcf-587eb1a87012-kube-api-access-2vtpb\") pod \"16dfdff5-f774-4b57-adcf-587eb1a87012\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.725810 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-proxy-ca-bundles\") pod \"16dfdff5-f774-4b57-adcf-587eb1a87012\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.725865 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-client-ca\") pod \"16dfdff5-f774-4b57-adcf-587eb1a87012\" (UID: 
\"16dfdff5-f774-4b57-adcf-587eb1a87012\") " Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.727338 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "16dfdff5-f774-4b57-adcf-587eb1a87012" (UID: "16dfdff5-f774-4b57-adcf-587eb1a87012"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.727371 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-client-ca" (OuterVolumeSpecName: "client-ca") pod "16dfdff5-f774-4b57-adcf-587eb1a87012" (UID: "16dfdff5-f774-4b57-adcf-587eb1a87012"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.727755 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-config" (OuterVolumeSpecName: "config") pod "16dfdff5-f774-4b57-adcf-587eb1a87012" (UID: "16dfdff5-f774-4b57-adcf-587eb1a87012"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.739888 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16dfdff5-f774-4b57-adcf-587eb1a87012-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "16dfdff5-f774-4b57-adcf-587eb1a87012" (UID: "16dfdff5-f774-4b57-adcf-587eb1a87012"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.739934 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16dfdff5-f774-4b57-adcf-587eb1a87012-kube-api-access-2vtpb" (OuterVolumeSpecName: "kube-api-access-2vtpb") pod "16dfdff5-f774-4b57-adcf-587eb1a87012" (UID: "16dfdff5-f774-4b57-adcf-587eb1a87012"). InnerVolumeSpecName "kube-api-access-2vtpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827052 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25c936ed-5081-4365-87f2-90f0cc29bb4e-serving-cert\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827124 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-client-ca\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827211 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-config\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827232 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg4qp\" 
(UniqueName: \"kubernetes.io/projected/25c936ed-5081-4365-87f2-90f0cc29bb4e-kube-api-access-vg4qp\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827344 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-proxy-ca-bundles\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827382 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dfdff5-f774-4b57-adcf-587eb1a87012-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827396 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827405 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vtpb\" (UniqueName: \"kubernetes.io/projected/16dfdff5-f774-4b57-adcf-587eb1a87012-kube-api-access-2vtpb\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827427 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827529 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.928751 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-proxy-ca-bundles\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.928788 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25c936ed-5081-4365-87f2-90f0cc29bb4e-serving-cert\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.928823 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-client-ca\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.928847 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-config\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.928872 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg4qp\" (UniqueName: 
\"kubernetes.io/projected/25c936ed-5081-4365-87f2-90f0cc29bb4e-kube-api-access-vg4qp\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.930486 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-client-ca\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.930571 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-config\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.931089 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-proxy-ca-bundles\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.943079 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25c936ed-5081-4365-87f2-90f0cc29bb4e-serving-cert\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.944585 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-8577b6d867-bbk7d_52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9/route-controller-manager/0.log" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.944660 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.945451 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg4qp\" (UniqueName: \"kubernetes.io/projected/25c936ed-5081-4365-87f2-90f0cc29bb4e-kube-api-access-vg4qp\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.999267 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.029992 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpsw7\" (UniqueName: \"kubernetes.io/projected/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-kube-api-access-qpsw7\") pod \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.030309 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-config\") pod \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.030464 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-serving-cert\") pod 
\"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.030536 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-client-ca\") pod \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.031438 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-client-ca" (OuterVolumeSpecName: "client-ca") pod "52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" (UID: "52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.031550 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-config" (OuterVolumeSpecName: "config") pod "52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" (UID: "52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.031799 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.031868 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.035882 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" (UID: "52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.035902 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-kube-api-access-qpsw7" (OuterVolumeSpecName: "kube-api-access-qpsw7") pod "52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" (UID: "52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9"). InnerVolumeSpecName "kube-api-access-qpsw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.129509 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535850-wzxmm"] Feb 27 00:10:00 crc kubenswrapper[4781]: E0227 00:10:00.130024 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" containerName="route-controller-manager" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.130043 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" containerName="route-controller-manager" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.130150 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" containerName="route-controller-manager" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.130477 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535850-wzxmm" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.134046 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.134562 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.134593 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpsw7\" (UniqueName: \"kubernetes.io/projected/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-kube-api-access-qpsw7\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.136041 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535850-wzxmm"] Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 
00:10:00.174597 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f9656d97f-jxdvc"] Feb 27 00:10:00 crc kubenswrapper[4781]: W0227 00:10:00.187306 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25c936ed_5081_4365_87f2_90f0cc29bb4e.slice/crio-a674bf23a48c6f2ceef7ad1a8e9b60a25ce02d22f7541a7f5589c91470d8adaf WatchSource:0}: Error finding container a674bf23a48c6f2ceef7ad1a8e9b60a25ce02d22f7541a7f5589c91470d8adaf: Status 404 returned error can't find the container with id a674bf23a48c6f2ceef7ad1a8e9b60a25ce02d22f7541a7f5589c91470d8adaf Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.236033 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tthh\" (UniqueName: \"kubernetes.io/projected/6acff23f-a17a-4f43-a7d6-32c8ccf4b084-kube-api-access-5tthh\") pod \"auto-csr-approver-29535850-wzxmm\" (UID: \"6acff23f-a17a-4f43-a7d6-32c8ccf4b084\") " pod="openshift-infra/auto-csr-approver-29535850-wzxmm" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.337213 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tthh\" (UniqueName: \"kubernetes.io/projected/6acff23f-a17a-4f43-a7d6-32c8ccf4b084-kube-api-access-5tthh\") pod \"auto-csr-approver-29535850-wzxmm\" (UID: \"6acff23f-a17a-4f43-a7d6-32c8ccf4b084\") " pod="openshift-infra/auto-csr-approver-29535850-wzxmm" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.358370 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tthh\" (UniqueName: \"kubernetes.io/projected/6acff23f-a17a-4f43-a7d6-32c8ccf4b084-kube-api-access-5tthh\") pod \"auto-csr-approver-29535850-wzxmm\" (UID: \"6acff23f-a17a-4f43-a7d6-32c8ccf4b084\") " pod="openshift-infra/auto-csr-approver-29535850-wzxmm" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 
00:10:00.461735 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535850-wzxmm" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.584119 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" event={"ID":"25c936ed-5081-4365-87f2-90f0cc29bb4e","Type":"ContainerStarted","Data":"02ad07479bdf08eecb193dd8838b03c4aed4d9358ea1b1b4013fddaa4dd4cb06"} Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.584168 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" event={"ID":"25c936ed-5081-4365-87f2-90f0cc29bb4e","Type":"ContainerStarted","Data":"a674bf23a48c6f2ceef7ad1a8e9b60a25ce02d22f7541a7f5589c91470d8adaf"} Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.584726 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.587749 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" event={"ID":"16dfdff5-f774-4b57-adcf-587eb1a87012","Type":"ContainerDied","Data":"38ad01967f3090afd63325bd381f932e92e8200a882b87d6b61db57aba747513"} Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.587792 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.587822 4781 scope.go:117] "RemoveContainer" containerID="6afdb00675a8fb6a3b5a8c7988539248e23cd32e95fcaa349611040a5a9b0dd3" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.590054 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.603145 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-8577b6d867-bbk7d_52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9/route-controller-manager/0.log" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.603225 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" event={"ID":"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9","Type":"ContainerDied","Data":"afebd9373c32a2481ceafebd85a9c76dfa968abcad4e05173bfa83af78d85aa8"} Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.603313 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.608935 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" podStartSLOduration=11.608910591 podStartE2EDuration="11.608910591s" podCreationTimestamp="2026-02-27 00:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:10:00.60324034 +0000 UTC m=+269.860779904" watchObservedRunningTime="2026-02-27 00:10:00.608910591 +0000 UTC m=+269.866450155" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.609643 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rnj7" event={"ID":"97e44b43-3c8e-4065-a51b-aa3f27c36712","Type":"ContainerStarted","Data":"ae64f7e1eb7e26c6ed833c20f9f08953b97623d771df012e1a6915bbe9ef458b"} Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.662487 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"] Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.671884 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"] Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.676442 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5rnj7" podStartSLOduration=3.989136788 podStartE2EDuration="47.676429861s" podCreationTimestamp="2026-02-27 00:09:13 +0000 UTC" firstStartedPulling="2026-02-27 00:09:16.09589659 +0000 UTC m=+225.353436144" lastFinishedPulling="2026-02-27 00:09:59.783189663 +0000 UTC m=+269.040729217" observedRunningTime="2026-02-27 00:10:00.674887933 +0000 UTC m=+269.932427487" 
watchObservedRunningTime="2026-02-27 00:10:00.676429861 +0000 UTC m=+269.933969415" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.691447 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5cf657794c-phnhf"] Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.698872 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5cf657794c-phnhf"] Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.853833 4781 scope.go:117] "RemoveContainer" containerID="9b2b3a0ab38d37a4eed10a4aaac7da312e7afd3ddf70467638573f0cec7e77bf" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.051316 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.125233 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.269112 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180f65d9-1cb5-411b-a031-6f97c06811d1-kube-api-access\") pod \"180f65d9-1cb5-411b-a031-6f97c06811d1\" (UID: \"180f65d9-1cb5-411b-a031-6f97c06811d1\") " Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.269314 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180f65d9-1cb5-411b-a031-6f97c06811d1-kubelet-dir\") pod \"180f65d9-1cb5-411b-a031-6f97c06811d1\" (UID: \"180f65d9-1cb5-411b-a031-6f97c06811d1\") " Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.269680 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/180f65d9-1cb5-411b-a031-6f97c06811d1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"180f65d9-1cb5-411b-a031-6f97c06811d1" (UID: "180f65d9-1cb5-411b-a031-6f97c06811d1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.277787 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/180f65d9-1cb5-411b-a031-6f97c06811d1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "180f65d9-1cb5-411b-a031-6f97c06811d1" (UID: "180f65d9-1cb5-411b-a031-6f97c06811d1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.321950 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16dfdff5-f774-4b57-adcf-587eb1a87012" path="/var/lib/kubelet/pods/16dfdff5-f774-4b57-adcf-587eb1a87012/volumes" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.323665 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" path="/var/lib/kubelet/pods/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9/volumes" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.359326 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535850-wzxmm"] Feb 27 00:10:01 crc kubenswrapper[4781]: W0227 00:10:01.366362 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6acff23f_a17a_4f43_a7d6_32c8ccf4b084.slice/crio-f728769b2293efcebad50430f72dd2a7fcc03a69d6e7c8ee03493c1d72f21662 WatchSource:0}: Error finding container f728769b2293efcebad50430f72dd2a7fcc03a69d6e7c8ee03493c1d72f21662: Status 404 returned error can't find the container with id f728769b2293efcebad50430f72dd2a7fcc03a69d6e7c8ee03493c1d72f21662 Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.370881 4781 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/180f65d9-1cb5-411b-a031-6f97c06811d1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.370921 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180f65d9-1cb5-411b-a031-6f97c06811d1-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.616166 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"180f65d9-1cb5-411b-a031-6f97c06811d1","Type":"ContainerDied","Data":"3684593fa27b2e9ecf581ed7146b0988ecfa46c54636096155683cb0b5d113f6"} Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.616217 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3684593fa27b2e9ecf581ed7146b0988ecfa46c54636096155683cb0b5d113f6" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.616180 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.619268 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535850-wzxmm" event={"ID":"6acff23f-a17a-4f43-a7d6-32c8ccf4b084","Type":"ContainerStarted","Data":"f728769b2293efcebad50430f72dd2a7fcc03a69d6e7c8ee03493c1d72f21662"} Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.101959 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b"] Feb 27 00:10:02 crc kubenswrapper[4781]: E0227 00:10:02.102177 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180f65d9-1cb5-411b-a031-6f97c06811d1" containerName="pruner" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.102188 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="180f65d9-1cb5-411b-a031-6f97c06811d1" containerName="pruner" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.102285 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="180f65d9-1cb5-411b-a031-6f97c06811d1" containerName="pruner" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.102981 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.105536 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.106794 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.107035 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.107092 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.107253 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.107714 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.111554 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b"] Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.282473 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-client-ca\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.282524 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-config\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.282573 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxn2w\" (UniqueName: \"kubernetes.io/projected/24aa2757-b776-45d4-b3b5-29f891553b70-kube-api-access-qxn2w\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.282779 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24aa2757-b776-45d4-b3b5-29f891553b70-serving-cert\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.384369 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24aa2757-b776-45d4-b3b5-29f891553b70-serving-cert\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.384435 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-client-ca\") pod 
\"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.384468 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-config\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.384515 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxn2w\" (UniqueName: \"kubernetes.io/projected/24aa2757-b776-45d4-b3b5-29f891553b70-kube-api-access-qxn2w\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.385750 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-client-ca\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.385860 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-config\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.392235 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24aa2757-b776-45d4-b3b5-29f891553b70-serving-cert\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.400552 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxn2w\" (UniqueName: \"kubernetes.io/projected/24aa2757-b776-45d4-b3b5-29f891553b70-kube-api-access-qxn2w\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.427483 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.626820 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj7h5" event={"ID":"514049ae-2568-416f-9705-524c2bf74cbd","Type":"ContainerStarted","Data":"6f6652462de4a86b8baaec52317db0d18a1393af64c4cae6b12430f98c10a218"} Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.642448 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b"] Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.646239 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dj7h5" podStartSLOduration=21.235177912 podStartE2EDuration="47.646218913s" podCreationTimestamp="2026-02-27 00:09:15 +0000 UTC" firstStartedPulling="2026-02-27 00:09:34.751025076 +0000 UTC m=+244.008564670" lastFinishedPulling="2026-02-27 00:10:01.162066107 +0000 UTC m=+270.419605671" 
observedRunningTime="2026-02-27 00:10:02.645025183 +0000 UTC m=+271.902564747" watchObservedRunningTime="2026-02-27 00:10:02.646218913 +0000 UTC m=+271.903758467" Feb 27 00:10:03 crc kubenswrapper[4781]: I0227 00:10:03.632930 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" event={"ID":"24aa2757-b776-45d4-b3b5-29f891553b70","Type":"ContainerStarted","Data":"71513888e0482af4d621781d19007e82957003c877ba4a7e270d5f9d9e9840db"} Feb 27 00:10:04 crc kubenswrapper[4781]: I0227 00:10:04.177713 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:10:04 crc kubenswrapper[4781]: I0227 00:10:04.177783 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:10:04 crc kubenswrapper[4781]: I0227 00:10:04.701150 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:10:04 crc kubenswrapper[4781]: I0227 00:10:04.972325 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qjwrj" Feb 27 00:10:05 crc kubenswrapper[4781]: I0227 00:10:05.537741 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:10:05 crc kubenswrapper[4781]: I0227 00:10:05.537858 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:10:06 crc kubenswrapper[4781]: I0227 00:10:06.582039 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dj7h5" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="registry-server" probeResult="failure" output=< Feb 27 00:10:06 crc kubenswrapper[4781]: timeout: failed to connect service 
":50051" within 1s Feb 27 00:10:06 crc kubenswrapper[4781]: > Feb 27 00:10:06 crc kubenswrapper[4781]: I0227 00:10:06.647689 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535850-wzxmm" event={"ID":"6acff23f-a17a-4f43-a7d6-32c8ccf4b084","Type":"ContainerStarted","Data":"313dbdb071dff64579864e870a0b09038434fbe0ef138af4cad66cd56ba9ca0d"} Feb 27 00:10:06 crc kubenswrapper[4781]: I0227 00:10:06.648999 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535848-ccctv" event={"ID":"df035290-8e3c-422b-90ac-573b592defcf","Type":"ContainerStarted","Data":"a316b4241144a66af579b620906b51669485f94b0371b42e5c56ba88e48d2942"} Feb 27 00:10:06 crc kubenswrapper[4781]: I0227 00:10:06.650225 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" event={"ID":"24aa2757-b776-45d4-b3b5-29f891553b70","Type":"ContainerStarted","Data":"dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba"} Feb 27 00:10:06 crc kubenswrapper[4781]: I0227 00:10:06.663399 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535848-ccctv" podStartSLOduration=67.952041111 podStartE2EDuration="2m6.663378975s" podCreationTimestamp="2026-02-27 00:08:00 +0000 UTC" firstStartedPulling="2026-02-27 00:09:07.214773423 +0000 UTC m=+216.472312977" lastFinishedPulling="2026-02-27 00:10:05.926111297 +0000 UTC m=+275.183650841" observedRunningTime="2026-02-27 00:10:06.659507019 +0000 UTC m=+275.917046593" watchObservedRunningTime="2026-02-27 00:10:06.663378975 +0000 UTC m=+275.920918529" Feb 27 00:10:06 crc kubenswrapper[4781]: I0227 00:10:06.907801 4781 csr.go:261] certificate signing request csr-jc8qj is approved, waiting to be issued Feb 27 00:10:06 crc kubenswrapper[4781]: I0227 00:10:06.915786 4781 csr.go:257] certificate signing request csr-jc8qj is issued Feb 27 00:10:07 crc 
kubenswrapper[4781]: I0227 00:10:07.656859 4781 generic.go:334] "Generic (PLEG): container finished" podID="6acff23f-a17a-4f43-a7d6-32c8ccf4b084" containerID="313dbdb071dff64579864e870a0b09038434fbe0ef138af4cad66cd56ba9ca0d" exitCode=0 Feb 27 00:10:07 crc kubenswrapper[4781]: I0227 00:10:07.656954 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535850-wzxmm" event={"ID":"6acff23f-a17a-4f43-a7d6-32c8ccf4b084","Type":"ContainerDied","Data":"313dbdb071dff64579864e870a0b09038434fbe0ef138af4cad66cd56ba9ca0d"} Feb 27 00:10:07 crc kubenswrapper[4781]: I0227 00:10:07.658891 4781 generic.go:334] "Generic (PLEG): container finished" podID="df035290-8e3c-422b-90ac-573b592defcf" containerID="a316b4241144a66af579b620906b51669485f94b0371b42e5c56ba88e48d2942" exitCode=0 Feb 27 00:10:07 crc kubenswrapper[4781]: I0227 00:10:07.658962 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535848-ccctv" event={"ID":"df035290-8e3c-422b-90ac-573b592defcf","Type":"ContainerDied","Data":"a316b4241144a66af579b620906b51669485f94b0371b42e5c56ba88e48d2942"} Feb 27 00:10:07 crc kubenswrapper[4781]: I0227 00:10:07.659248 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:07 crc kubenswrapper[4781]: I0227 00:10:07.666980 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:07 crc kubenswrapper[4781]: I0227 00:10:07.683743 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" podStartSLOduration=18.683724897 podStartE2EDuration="18.683724897s" podCreationTimestamp="2026-02-27 00:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:10:07.682868166 +0000 UTC m=+276.940407720" watchObservedRunningTime="2026-02-27 00:10:07.683724897 +0000 UTC m=+276.941264441" Feb 27 00:10:07 crc kubenswrapper[4781]: I0227 00:10:07.917565 4781 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-09 00:41:14.368927459 +0000 UTC Feb 27 00:10:07 crc kubenswrapper[4781]: I0227 00:10:07.917610 4781 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7584h31m6.451319371s for next certificate rotation Feb 27 00:10:08 crc kubenswrapper[4781]: I0227 00:10:08.977682 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535850-wzxmm" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.088148 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tthh\" (UniqueName: \"kubernetes.io/projected/6acff23f-a17a-4f43-a7d6-32c8ccf4b084-kube-api-access-5tthh\") pod \"6acff23f-a17a-4f43-a7d6-32c8ccf4b084\" (UID: \"6acff23f-a17a-4f43-a7d6-32c8ccf4b084\") " Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.091730 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535848-ccctv" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.093393 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6acff23f-a17a-4f43-a7d6-32c8ccf4b084-kube-api-access-5tthh" (OuterVolumeSpecName: "kube-api-access-5tthh") pod "6acff23f-a17a-4f43-a7d6-32c8ccf4b084" (UID: "6acff23f-a17a-4f43-a7d6-32c8ccf4b084"). InnerVolumeSpecName "kube-api-access-5tthh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.193752 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv9hp\" (UniqueName: \"kubernetes.io/projected/df035290-8e3c-422b-90ac-573b592defcf-kube-api-access-mv9hp\") pod \"df035290-8e3c-422b-90ac-573b592defcf\" (UID: \"df035290-8e3c-422b-90ac-573b592defcf\") " Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.194128 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tthh\" (UniqueName: \"kubernetes.io/projected/6acff23f-a17a-4f43-a7d6-32c8ccf4b084-kube-api-access-5tthh\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.198118 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df035290-8e3c-422b-90ac-573b592defcf-kube-api-access-mv9hp" (OuterVolumeSpecName: "kube-api-access-mv9hp") pod "df035290-8e3c-422b-90ac-573b592defcf" (UID: "df035290-8e3c-422b-90ac-573b592defcf"). InnerVolumeSpecName "kube-api-access-mv9hp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.294982 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv9hp\" (UniqueName: \"kubernetes.io/projected/df035290-8e3c-422b-90ac-573b592defcf-kube-api-access-mv9hp\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.667137 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f9656d97f-jxdvc"] Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.667434 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" podUID="25c936ed-5081-4365-87f2-90f0cc29bb4e" containerName="controller-manager" containerID="cri-o://02ad07479bdf08eecb193dd8838b03c4aed4d9358ea1b1b4013fddaa4dd4cb06" gracePeriod=30 Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.676140 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535848-ccctv" event={"ID":"df035290-8e3c-422b-90ac-573b592defcf","Type":"ContainerDied","Data":"73bd0b78edcc81c67b914cc89cfaf8646b9814d5783ad5e9856330864dac671a"} Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.676181 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73bd0b78edcc81c67b914cc89cfaf8646b9814d5783ad5e9856330864dac671a" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.676250 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535848-ccctv" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.677786 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535850-wzxmm" event={"ID":"6acff23f-a17a-4f43-a7d6-32c8ccf4b084","Type":"ContainerDied","Data":"f728769b2293efcebad50430f72dd2a7fcc03a69d6e7c8ee03493c1d72f21662"} Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.677860 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f728769b2293efcebad50430f72dd2a7fcc03a69d6e7c8ee03493c1d72f21662" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.677815 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535850-wzxmm" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.695271 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b"] Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.000905 4781 patch_prober.go:28] interesting pod/controller-manager-5f9656d97f-jxdvc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" start-of-body= Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.000982 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" podUID="25c936ed-5081-4365-87f2-90f0cc29bb4e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.686027 4781 generic.go:334] "Generic (PLEG): container finished" podID="25c936ed-5081-4365-87f2-90f0cc29bb4e" 
containerID="02ad07479bdf08eecb193dd8838b03c4aed4d9358ea1b1b4013fddaa4dd4cb06" exitCode=0 Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.686081 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" event={"ID":"25c936ed-5081-4365-87f2-90f0cc29bb4e","Type":"ContainerDied","Data":"02ad07479bdf08eecb193dd8838b03c4aed4d9358ea1b1b4013fddaa4dd4cb06"} Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.688316 4781 generic.go:334] "Generic (PLEG): container finished" podID="baa593f3-06c4-461f-a893-609b07dfd282" containerID="c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f" exitCode=0 Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.688421 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ngbg" event={"ID":"baa593f3-06c4-461f-a893-609b07dfd282","Type":"ContainerDied","Data":"c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f"} Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.688495 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" podUID="24aa2757-b776-45d4-b3b5-29f891553b70" containerName="route-controller-manager" containerID="cri-o://dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba" gracePeriod=30 Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.814471 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.858968 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6dccb78d65-ddtvh"] Feb 27 00:10:10 crc kubenswrapper[4781]: E0227 00:10:10.859767 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df035290-8e3c-422b-90ac-573b592defcf" containerName="oc" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.859834 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="df035290-8e3c-422b-90ac-573b592defcf" containerName="oc" Feb 27 00:10:10 crc kubenswrapper[4781]: E0227 00:10:10.859852 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c936ed-5081-4365-87f2-90f0cc29bb4e" containerName="controller-manager" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.859859 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c936ed-5081-4365-87f2-90f0cc29bb4e" containerName="controller-manager" Feb 27 00:10:10 crc kubenswrapper[4781]: E0227 00:10:10.859892 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6acff23f-a17a-4f43-a7d6-32c8ccf4b084" containerName="oc" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.859900 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6acff23f-a17a-4f43-a7d6-32c8ccf4b084" containerName="oc" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.860077 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6acff23f-a17a-4f43-a7d6-32c8ccf4b084" containerName="oc" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.860095 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="df035290-8e3c-422b-90ac-573b592defcf" containerName="oc" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.860135 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="25c936ed-5081-4365-87f2-90f0cc29bb4e" 
containerName="controller-manager" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.862501 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.873731 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dccb78d65-ddtvh"] Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.919932 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-proxy-ca-bundles\") pod \"25c936ed-5081-4365-87f2-90f0cc29bb4e\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.920000 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25c936ed-5081-4365-87f2-90f0cc29bb4e-serving-cert\") pod \"25c936ed-5081-4365-87f2-90f0cc29bb4e\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.920042 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-client-ca\") pod \"25c936ed-5081-4365-87f2-90f0cc29bb4e\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.920075 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-config\") pod \"25c936ed-5081-4365-87f2-90f0cc29bb4e\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.920117 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg4qp\" 
(UniqueName: \"kubernetes.io/projected/25c936ed-5081-4365-87f2-90f0cc29bb4e-kube-api-access-vg4qp\") pod \"25c936ed-5081-4365-87f2-90f0cc29bb4e\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.920230 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6rqm\" (UniqueName: \"kubernetes.io/projected/62880941-3b5d-4517-b0df-8c5548f8298d-kube-api-access-h6rqm\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.920282 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62880941-3b5d-4517-b0df-8c5548f8298d-serving-cert\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.920325 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-proxy-ca-bundles\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.920415 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-config\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:10 crc kubenswrapper[4781]: 
I0227 00:10:10.920450 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-client-ca\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.921251 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-client-ca" (OuterVolumeSpecName: "client-ca") pod "25c936ed-5081-4365-87f2-90f0cc29bb4e" (UID: "25c936ed-5081-4365-87f2-90f0cc29bb4e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.921373 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-config" (OuterVolumeSpecName: "config") pod "25c936ed-5081-4365-87f2-90f0cc29bb4e" (UID: "25c936ed-5081-4365-87f2-90f0cc29bb4e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.922200 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "25c936ed-5081-4365-87f2-90f0cc29bb4e" (UID: "25c936ed-5081-4365-87f2-90f0cc29bb4e"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.926870 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25c936ed-5081-4365-87f2-90f0cc29bb4e-kube-api-access-vg4qp" (OuterVolumeSpecName: "kube-api-access-vg4qp") pod "25c936ed-5081-4365-87f2-90f0cc29bb4e" (UID: "25c936ed-5081-4365-87f2-90f0cc29bb4e"). InnerVolumeSpecName "kube-api-access-vg4qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.927351 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c936ed-5081-4365-87f2-90f0cc29bb4e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "25c936ed-5081-4365-87f2-90f0cc29bb4e" (UID: "25c936ed-5081-4365-87f2-90f0cc29bb4e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025065 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-proxy-ca-bundles\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025145 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-config\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025173 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-client-ca\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025199 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6rqm\" (UniqueName: \"kubernetes.io/projected/62880941-3b5d-4517-b0df-8c5548f8298d-kube-api-access-h6rqm\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025224 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62880941-3b5d-4517-b0df-8c5548f8298d-serving-cert\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025257 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25c936ed-5081-4365-87f2-90f0cc29bb4e-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025267 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025277 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025286 4781 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-vg4qp\" (UniqueName: \"kubernetes.io/projected/25c936ed-5081-4365-87f2-90f0cc29bb4e-kube-api-access-vg4qp\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025295 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.026513 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-client-ca\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.027228 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-proxy-ca-bundles\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.027383 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-config\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.034093 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62880941-3b5d-4517-b0df-8c5548f8298d-serving-cert\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " 
pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.046992 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6rqm\" (UniqueName: \"kubernetes.io/projected/62880941-3b5d-4517-b0df-8c5548f8298d-kube-api-access-h6rqm\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.112709 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.125618 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-client-ca\") pod \"24aa2757-b776-45d4-b3b5-29f891553b70\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.125760 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxn2w\" (UniqueName: \"kubernetes.io/projected/24aa2757-b776-45d4-b3b5-29f891553b70-kube-api-access-qxn2w\") pod \"24aa2757-b776-45d4-b3b5-29f891553b70\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.125792 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-config\") pod \"24aa2757-b776-45d4-b3b5-29f891553b70\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.125814 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/24aa2757-b776-45d4-b3b5-29f891553b70-serving-cert\") pod \"24aa2757-b776-45d4-b3b5-29f891553b70\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.128217 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-config" (OuterVolumeSpecName: "config") pod "24aa2757-b776-45d4-b3b5-29f891553b70" (UID: "24aa2757-b776-45d4-b3b5-29f891553b70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.128206 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-client-ca" (OuterVolumeSpecName: "client-ca") pod "24aa2757-b776-45d4-b3b5-29f891553b70" (UID: "24aa2757-b776-45d4-b3b5-29f891553b70"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.132443 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24aa2757-b776-45d4-b3b5-29f891553b70-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "24aa2757-b776-45d4-b3b5-29f891553b70" (UID: "24aa2757-b776-45d4-b3b5-29f891553b70"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.132832 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24aa2757-b776-45d4-b3b5-29f891553b70-kube-api-access-qxn2w" (OuterVolumeSpecName: "kube-api-access-qxn2w") pod "24aa2757-b776-45d4-b3b5-29f891553b70" (UID: "24aa2757-b776-45d4-b3b5-29f891553b70"). InnerVolumeSpecName "kube-api-access-qxn2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.181223 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.226994 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.227029 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24aa2757-b776-45d4-b3b5-29f891553b70-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.227038 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.227050 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxn2w\" (UniqueName: \"kubernetes.io/projected/24aa2757-b776-45d4-b3b5-29f891553b70-kube-api-access-qxn2w\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.428437 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dccb78d65-ddtvh"] Feb 27 00:10:11 crc kubenswrapper[4781]: W0227 00:10:11.435477 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62880941_3b5d_4517_b0df_8c5548f8298d.slice/crio-b10681d24475519937e6ebec881f7fb610590065caed7686034ef5962eec396d WatchSource:0}: Error finding container b10681d24475519937e6ebec881f7fb610590065caed7686034ef5962eec396d: Status 404 returned error can't find the container with id 
b10681d24475519937e6ebec881f7fb610590065caed7686034ef5962eec396d Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.699111 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ngbg" event={"ID":"baa593f3-06c4-461f-a893-609b07dfd282","Type":"ContainerStarted","Data":"282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5"} Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.700612 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcdz5" event={"ID":"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0","Type":"ContainerStarted","Data":"a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093"} Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.701465 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" event={"ID":"62880941-3b5d-4517-b0df-8c5548f8298d","Type":"ContainerStarted","Data":"b10681d24475519937e6ebec881f7fb610590065caed7686034ef5962eec396d"} Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.702682 4781 generic.go:334] "Generic (PLEG): container finished" podID="24aa2757-b776-45d4-b3b5-29f891553b70" containerID="dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba" exitCode=0 Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.702733 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" event={"ID":"24aa2757-b776-45d4-b3b5-29f891553b70","Type":"ContainerDied","Data":"dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba"} Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.702730 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.702766 4781 scope.go:117] "RemoveContainer" containerID="dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.702755 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" event={"ID":"24aa2757-b776-45d4-b3b5-29f891553b70","Type":"ContainerDied","Data":"71513888e0482af4d621781d19007e82957003c877ba4a7e270d5f9d9e9840db"} Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.706479 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" event={"ID":"25c936ed-5081-4365-87f2-90f0cc29bb4e","Type":"ContainerDied","Data":"a674bf23a48c6f2ceef7ad1a8e9b60a25ce02d22f7541a7f5589c91470d8adaf"} Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.706608 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.711238 4781 generic.go:334] "Generic (PLEG): container finished" podID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerID="7a5bc22436045a92f14d9e48387b73688e7285010edca28bce2bf80e2706ff98" exitCode=0 Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.711307 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqrgb" event={"ID":"ac30245d-7e42-440c-99a0-60e2ae15cb8b","Type":"ContainerDied","Data":"7a5bc22436045a92f14d9e48387b73688e7285010edca28bce2bf80e2706ff98"} Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.717877 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9ngbg" podStartSLOduration=3.735382734 podStartE2EDuration="58.717858732s" podCreationTimestamp="2026-02-27 00:09:13 +0000 UTC" firstStartedPulling="2026-02-27 00:09:16.175931911 +0000 UTC m=+225.433471465" lastFinishedPulling="2026-02-27 00:10:11.158407909 +0000 UTC m=+280.415947463" observedRunningTime="2026-02-27 00:10:11.716802006 +0000 UTC m=+280.974341590" watchObservedRunningTime="2026-02-27 00:10:11.717858732 +0000 UTC m=+280.975398286" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.724622 4781 scope.go:117] "RemoveContainer" containerID="dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba" Feb 27 00:10:11 crc kubenswrapper[4781]: E0227 00:10:11.725209 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba\": container with ID starting with dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba not found: ID does not exist" containerID="dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 
00:10:11.725273 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba"} err="failed to get container status \"dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba\": rpc error: code = NotFound desc = could not find container \"dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba\": container with ID starting with dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba not found: ID does not exist" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.725317 4781 scope.go:117] "RemoveContainer" containerID="02ad07479bdf08eecb193dd8838b03c4aed4d9358ea1b1b4013fddaa4dd4cb06" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.735402 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b"] Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.752573 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b"] Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.756929 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f9656d97f-jxdvc"] Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.759740 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5f9656d97f-jxdvc"] Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.718085 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" event={"ID":"62880941-3b5d-4517-b0df-8c5548f8298d","Type":"ContainerStarted","Data":"b8e38f1e216924c6d8d1fbab43dc9e21f5123054b598be25b535e4702b38b353"} Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.719557 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.734968 4781 generic.go:334] "Generic (PLEG): container finished" podID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerID="a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093" exitCode=0 Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.735023 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcdz5" event={"ID":"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0","Type":"ContainerDied","Data":"a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093"} Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.736234 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.746524 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" podStartSLOduration=3.7464937320000002 podStartE2EDuration="3.746493732s" podCreationTimestamp="2026-02-27 00:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:10:12.742330348 +0000 UTC m=+281.999869902" watchObservedRunningTime="2026-02-27 00:10:12.746493732 +0000 UTC m=+282.004033276" Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.895557 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.895664 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" 
podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.895729 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.896483 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.896564 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089" gracePeriod=600 Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.112697 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl"] Feb 27 00:10:13 crc kubenswrapper[4781]: E0227 00:10:13.112941 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24aa2757-b776-45d4-b3b5-29f891553b70" containerName="route-controller-manager" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.112953 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="24aa2757-b776-45d4-b3b5-29f891553b70" containerName="route-controller-manager" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.113051 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="24aa2757-b776-45d4-b3b5-29f891553b70" containerName="route-controller-manager" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.113511 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.115603 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.117798 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.117802 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.118010 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.119538 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.119579 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.129268 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl"] Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.271560 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txcb7\" (UniqueName: \"kubernetes.io/projected/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-kube-api-access-txcb7\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: 
\"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.271676 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-client-ca\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.271766 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-serving-cert\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.271789 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-config\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.323234 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24aa2757-b776-45d4-b3b5-29f891553b70" path="/var/lib/kubelet/pods/24aa2757-b776-45d4-b3b5-29f891553b70/volumes" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.324118 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25c936ed-5081-4365-87f2-90f0cc29bb4e" path="/var/lib/kubelet/pods/25c936ed-5081-4365-87f2-90f0cc29bb4e/volumes" Feb 27 00:10:13 crc 
kubenswrapper[4781]: I0227 00:10:13.372548 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-serving-cert\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.372855 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-config\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.372922 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txcb7\" (UniqueName: \"kubernetes.io/projected/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-kube-api-access-txcb7\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.372970 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-client-ca\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.373840 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-config\") pod 
\"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.373884 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-client-ca\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.379481 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-serving-cert\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.391334 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txcb7\" (UniqueName: \"kubernetes.io/projected/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-kube-api-access-txcb7\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.433151 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.745552 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089" exitCode=0 Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.745688 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089"} Feb 27 00:10:14 crc kubenswrapper[4781]: I0227 00:10:14.076539 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:10:14 crc kubenswrapper[4781]: I0227 00:10:14.076602 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:10:14 crc kubenswrapper[4781]: I0227 00:10:14.159029 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:10:14 crc kubenswrapper[4781]: I0227 00:10:14.231032 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:10:15 crc kubenswrapper[4781]: I0227 00:10:15.600797 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:10:15 crc kubenswrapper[4781]: I0227 00:10:15.722322 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:10:18 crc kubenswrapper[4781]: I0227 00:10:18.147940 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-5rnj7"] Feb 27 00:10:18 crc kubenswrapper[4781]: I0227 00:10:18.148371 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5rnj7" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerName="registry-server" containerID="cri-o://ae64f7e1eb7e26c6ed833c20f9f08953b97623d771df012e1a6915bbe9ef458b" gracePeriod=2 Feb 27 00:10:19 crc kubenswrapper[4781]: I0227 00:10:19.146499 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dj7h5"] Feb 27 00:10:19 crc kubenswrapper[4781]: I0227 00:10:19.147282 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dj7h5" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="registry-server" containerID="cri-o://6f6652462de4a86b8baaec52317db0d18a1393af64c4cae6b12430f98c10a218" gracePeriod=2 Feb 27 00:10:19 crc kubenswrapper[4781]: I0227 00:10:19.796507 4781 generic.go:334] "Generic (PLEG): container finished" podID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerID="ae64f7e1eb7e26c6ed833c20f9f08953b97623d771df012e1a6915bbe9ef458b" exitCode=0 Feb 27 00:10:19 crc kubenswrapper[4781]: I0227 00:10:19.796572 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rnj7" event={"ID":"97e44b43-3c8e-4065-a51b-aa3f27c36712","Type":"ContainerDied","Data":"ae64f7e1eb7e26c6ed833c20f9f08953b97623d771df012e1a6915bbe9ef458b"} Feb 27 00:10:20 crc kubenswrapper[4781]: I0227 00:10:20.802673 4781 generic.go:334] "Generic (PLEG): container finished" podID="514049ae-2568-416f-9705-524c2bf74cbd" containerID="6f6652462de4a86b8baaec52317db0d18a1393af64c4cae6b12430f98c10a218" exitCode=0 Feb 27 00:10:20 crc kubenswrapper[4781]: I0227 00:10:20.802709 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj7h5" 
event={"ID":"514049ae-2568-416f-9705-524c2bf74cbd","Type":"ContainerDied","Data":"6f6652462de4a86b8baaec52317db0d18a1393af64c4cae6b12430f98c10a218"} Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.786178 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.790878 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.810523 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rnj7" event={"ID":"97e44b43-3c8e-4065-a51b-aa3f27c36712","Type":"ContainerDied","Data":"b4c78b3d5964c2a730f268fed158cc29cd746663976e63644c0b8dcc232f4b12"} Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.810579 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.810583 4781 scope.go:117] "RemoveContainer" containerID="ae64f7e1eb7e26c6ed833c20f9f08953b97623d771df012e1a6915bbe9ef458b" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.814658 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj7h5" event={"ID":"514049ae-2568-416f-9705-524c2bf74cbd","Type":"ContainerDied","Data":"45d5d509e8ad0dc50e09ff3936cc7a26189c6c645b18672248f0a72722749ca4"} Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.814740 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.889766 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-utilities\") pod \"97e44b43-3c8e-4065-a51b-aa3f27c36712\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.889821 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-catalog-content\") pod \"97e44b43-3c8e-4065-a51b-aa3f27c36712\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.889875 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z44vv\" (UniqueName: \"kubernetes.io/projected/514049ae-2568-416f-9705-524c2bf74cbd-kube-api-access-z44vv\") pod \"514049ae-2568-416f-9705-524c2bf74cbd\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.889924 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-catalog-content\") pod \"514049ae-2568-416f-9705-524c2bf74cbd\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.890020 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6frr\" (UniqueName: \"kubernetes.io/projected/97e44b43-3c8e-4065-a51b-aa3f27c36712-kube-api-access-w6frr\") pod \"97e44b43-3c8e-4065-a51b-aa3f27c36712\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.890053 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-utilities\") pod \"514049ae-2568-416f-9705-524c2bf74cbd\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.890711 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-utilities" (OuterVolumeSpecName: "utilities") pod "97e44b43-3c8e-4065-a51b-aa3f27c36712" (UID: "97e44b43-3c8e-4065-a51b-aa3f27c36712"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.891133 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-utilities" (OuterVolumeSpecName: "utilities") pod "514049ae-2568-416f-9705-524c2bf74cbd" (UID: "514049ae-2568-416f-9705-524c2bf74cbd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.896154 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e44b43-3c8e-4065-a51b-aa3f27c36712-kube-api-access-w6frr" (OuterVolumeSpecName: "kube-api-access-w6frr") pod "97e44b43-3c8e-4065-a51b-aa3f27c36712" (UID: "97e44b43-3c8e-4065-a51b-aa3f27c36712"). InnerVolumeSpecName "kube-api-access-w6frr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.903813 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/514049ae-2568-416f-9705-524c2bf74cbd-kube-api-access-z44vv" (OuterVolumeSpecName: "kube-api-access-z44vv") pod "514049ae-2568-416f-9705-524c2bf74cbd" (UID: "514049ae-2568-416f-9705-524c2bf74cbd"). InnerVolumeSpecName "kube-api-access-z44vv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.925414 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97e44b43-3c8e-4065-a51b-aa3f27c36712" (UID: "97e44b43-3c8e-4065-a51b-aa3f27c36712"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.992127 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6frr\" (UniqueName: \"kubernetes.io/projected/97e44b43-3c8e-4065-a51b-aa3f27c36712-kube-api-access-w6frr\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.992174 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.992193 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.992210 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.992229 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z44vv\" (UniqueName: \"kubernetes.io/projected/514049ae-2568-416f-9705-524c2bf74cbd-kube-api-access-z44vv\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:22 crc kubenswrapper[4781]: I0227 00:10:22.034780 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "514049ae-2568-416f-9705-524c2bf74cbd" (UID: "514049ae-2568-416f-9705-524c2bf74cbd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:10:22 crc kubenswrapper[4781]: I0227 00:10:22.093422 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:22 crc kubenswrapper[4781]: I0227 00:10:22.155368 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rnj7"] Feb 27 00:10:22 crc kubenswrapper[4781]: I0227 00:10:22.162290 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rnj7"] Feb 27 00:10:22 crc kubenswrapper[4781]: I0227 00:10:22.170909 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dj7h5"] Feb 27 00:10:22 crc kubenswrapper[4781]: I0227 00:10:22.174894 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dj7h5"] Feb 27 00:10:22 crc kubenswrapper[4781]: I0227 00:10:22.589569 4781 scope.go:117] "RemoveContainer" containerID="9c12fe8df9037297d8af4eedba0d4e04fa1c5be02d943f1b25318346033b7fc9" Feb 27 00:10:23 crc kubenswrapper[4781]: I0227 00:10:23.316520 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="514049ae-2568-416f-9705-524c2bf74cbd" path="/var/lib/kubelet/pods/514049ae-2568-416f-9705-524c2bf74cbd/volumes" Feb 27 00:10:23 crc kubenswrapper[4781]: I0227 00:10:23.318828 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" path="/var/lib/kubelet/pods/97e44b43-3c8e-4065-a51b-aa3f27c36712/volumes" Feb 27 00:10:23 crc kubenswrapper[4781]: I0227 
00:10:23.441770 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl"] Feb 27 00:10:23 crc kubenswrapper[4781]: W0227 00:10:23.948966 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb04ba107_8dd1_4d8d_88d3_5a762f6c60f1.slice/crio-f45bacdd853d75d92adb8b303b061914233c1920428a49de04d9a44ec247f82d WatchSource:0}: Error finding container f45bacdd853d75d92adb8b303b061914233c1920428a49de04d9a44ec247f82d: Status 404 returned error can't find the container with id f45bacdd853d75d92adb8b303b061914233c1920428a49de04d9a44ec247f82d Feb 27 00:10:24 crc kubenswrapper[4781]: I0227 00:10:24.008417 4781 scope.go:117] "RemoveContainer" containerID="b414a361ce30e28fdc5bc47f53f766e6427e2ccb8cfe76be4eed8ce4ee48ebca" Feb 27 00:10:24 crc kubenswrapper[4781]: I0227 00:10:24.088989 4781 scope.go:117] "RemoveContainer" containerID="6f6652462de4a86b8baaec52317db0d18a1393af64c4cae6b12430f98c10a218" Feb 27 00:10:24 crc kubenswrapper[4781]: I0227 00:10:24.136853 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:10:24 crc kubenswrapper[4781]: I0227 00:10:24.165923 4781 scope.go:117] "RemoveContainer" containerID="1701a618ae78c1968b5098401e32f2c349b0a0fd1ab9fdf4f23fd86a66112646" Feb 27 00:10:24 crc kubenswrapper[4781]: I0227 00:10:24.207029 4781 scope.go:117] "RemoveContainer" containerID="39f26f7fa9552ef0082d4338be84e32dc690ddb73a7ed4be83f09421026f56c7" Feb 27 00:10:24 crc kubenswrapper[4781]: I0227 00:10:24.842061 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" event={"ID":"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1","Type":"ContainerStarted","Data":"f45bacdd853d75d92adb8b303b061914233c1920428a49de04d9a44ec247f82d"} Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 
00:10:25.852916 4781 generic.go:334] "Generic (PLEG): container finished" podID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerID="254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8" exitCode=0 Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.853093 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kztqg" event={"ID":"2b050e9e-d6c8-4e27-ad3f-9681553c1539","Type":"ContainerDied","Data":"254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8"} Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.856804 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqrgb" event={"ID":"ac30245d-7e42-440c-99a0-60e2ae15cb8b","Type":"ContainerStarted","Data":"ec7472b1d4abe3539fd2b9c6a74552c975f1e7a845d80d7f3684a0e55a838de1"} Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.861090 4781 generic.go:334] "Generic (PLEG): container finished" podID="19ed5401-2778-4266-8bf1-1c7244dac100" containerID="f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c" exitCode=0 Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.861175 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42hbx" event={"ID":"19ed5401-2778-4266-8bf1-1c7244dac100","Type":"ContainerDied","Data":"f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c"} Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.862858 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" event={"ID":"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1","Type":"ContainerStarted","Data":"97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5"} Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.863259 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" 
Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.865676 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcdz5" event={"ID":"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0","Type":"ContainerStarted","Data":"170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3"} Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.867746 4781 generic.go:334] "Generic (PLEG): container finished" podID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerID="4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f" exitCode=0 Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.867844 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52xgq" event={"ID":"0f286d62-2145-4bbb-91eb-28ffda9b2494","Type":"ContainerDied","Data":"4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f"} Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.870451 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"0a9584e9887d3110a6a6d2ad5c5024fb38c734637c177fd2cbddb2eae4932cdc"} Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.874758 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.904361 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kqrgb" podStartSLOduration=7.773629081 podStartE2EDuration="1m14.904342472s" podCreationTimestamp="2026-02-27 00:09:11 +0000 UTC" firstStartedPulling="2026-02-27 00:09:12.932463446 +0000 UTC m=+222.190003000" lastFinishedPulling="2026-02-27 00:10:20.063176827 +0000 UTC m=+289.320716391" observedRunningTime="2026-02-27 00:10:25.901585663 +0000 UTC 
m=+295.159125217" watchObservedRunningTime="2026-02-27 00:10:25.904342472 +0000 UTC m=+295.161882036" Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.925240 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" podStartSLOduration=16.925222371 podStartE2EDuration="16.925222371s" podCreationTimestamp="2026-02-27 00:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:10:25.922192716 +0000 UTC m=+295.179732270" watchObservedRunningTime="2026-02-27 00:10:25.925222371 +0000 UTC m=+295.182761925" Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.963662 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hcdz5" podStartSLOduration=4.126813534 podStartE2EDuration="1m11.963645507s" podCreationTimestamp="2026-02-27 00:09:14 +0000 UTC" firstStartedPulling="2026-02-27 00:09:16.111037399 +0000 UTC m=+225.368576953" lastFinishedPulling="2026-02-27 00:10:23.947869342 +0000 UTC m=+293.205408926" observedRunningTime="2026-02-27 00:10:25.960659673 +0000 UTC m=+295.218199227" watchObservedRunningTime="2026-02-27 00:10:25.963645507 +0000 UTC m=+295.221185061" Feb 27 00:10:26 crc kubenswrapper[4781]: I0227 00:10:26.877347 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kztqg" event={"ID":"2b050e9e-d6c8-4e27-ad3f-9681553c1539","Type":"ContainerStarted","Data":"efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a"} Feb 27 00:10:26 crc kubenswrapper[4781]: I0227 00:10:26.879943 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42hbx" event={"ID":"19ed5401-2778-4266-8bf1-1c7244dac100","Type":"ContainerStarted","Data":"22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce"} Feb 27 
00:10:26 crc kubenswrapper[4781]: I0227 00:10:26.882563 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52xgq" event={"ID":"0f286d62-2145-4bbb-91eb-28ffda9b2494","Type":"ContainerStarted","Data":"257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8"} Feb 27 00:10:26 crc kubenswrapper[4781]: I0227 00:10:26.917667 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kztqg" podStartSLOduration=2.353346412 podStartE2EDuration="1m15.917647129s" podCreationTimestamp="2026-02-27 00:09:11 +0000 UTC" firstStartedPulling="2026-02-27 00:09:13.005814103 +0000 UTC m=+222.263353657" lastFinishedPulling="2026-02-27 00:10:26.57011482 +0000 UTC m=+295.827654374" observedRunningTime="2026-02-27 00:10:26.899220351 +0000 UTC m=+296.156759905" watchObservedRunningTime="2026-02-27 00:10:26.917647129 +0000 UTC m=+296.175186693" Feb 27 00:10:26 crc kubenswrapper[4781]: I0227 00:10:26.918307 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-52xgq" podStartSLOduration=2.61737827 podStartE2EDuration="1m14.918301136s" podCreationTimestamp="2026-02-27 00:09:12 +0000 UTC" firstStartedPulling="2026-02-27 00:09:14.01346862 +0000 UTC m=+223.271008174" lastFinishedPulling="2026-02-27 00:10:26.314391486 +0000 UTC m=+295.571931040" observedRunningTime="2026-02-27 00:10:26.916513651 +0000 UTC m=+296.174053215" watchObservedRunningTime="2026-02-27 00:10:26.918301136 +0000 UTC m=+296.175840690" Feb 27 00:10:26 crc kubenswrapper[4781]: I0227 00:10:26.937751 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-42hbx" podStartSLOduration=2.604352457 podStartE2EDuration="1m15.937737439s" podCreationTimestamp="2026-02-27 00:09:11 +0000 UTC" firstStartedPulling="2026-02-27 00:09:12.937484181 +0000 UTC m=+222.195023735" 
lastFinishedPulling="2026-02-27 00:10:26.270869153 +0000 UTC m=+295.528408717" observedRunningTime="2026-02-27 00:10:26.934987611 +0000 UTC m=+296.192527165" watchObservedRunningTime="2026-02-27 00:10:26.937737439 +0000 UTC m=+296.195276993" Feb 27 00:10:29 crc kubenswrapper[4781]: I0227 00:10:29.691726 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dccb78d65-ddtvh"] Feb 27 00:10:29 crc kubenswrapper[4781]: I0227 00:10:29.692394 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" podUID="62880941-3b5d-4517-b0df-8c5548f8298d" containerName="controller-manager" containerID="cri-o://b8e38f1e216924c6d8d1fbab43dc9e21f5123054b598be25b535e4702b38b353" gracePeriod=30 Feb 27 00:10:29 crc kubenswrapper[4781]: I0227 00:10:29.780386 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl"] Feb 27 00:10:29 crc kubenswrapper[4781]: I0227 00:10:29.780598 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" podUID="b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" containerName="route-controller-manager" containerID="cri-o://97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5" gracePeriod=30 Feb 27 00:10:29 crc kubenswrapper[4781]: I0227 00:10:29.900787 4781 generic.go:334] "Generic (PLEG): container finished" podID="62880941-3b5d-4517-b0df-8c5548f8298d" containerID="b8e38f1e216924c6d8d1fbab43dc9e21f5123054b598be25b535e4702b38b353" exitCode=0 Feb 27 00:10:29 crc kubenswrapper[4781]: I0227 00:10:29.901082 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" 
event={"ID":"62880941-3b5d-4517-b0df-8c5548f8298d","Type":"ContainerDied","Data":"b8e38f1e216924c6d8d1fbab43dc9e21f5123054b598be25b535e4702b38b353"} Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.263736 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.271038 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317327 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txcb7\" (UniqueName: \"kubernetes.io/projected/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-kube-api-access-txcb7\") pod \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317384 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-serving-cert\") pod \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317411 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-client-ca\") pod \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317438 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62880941-3b5d-4517-b0df-8c5548f8298d-serving-cert\") pod \"62880941-3b5d-4517-b0df-8c5548f8298d\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " 
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317507 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-config\") pod \"62880941-3b5d-4517-b0df-8c5548f8298d\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317524 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-proxy-ca-bundles\") pod \"62880941-3b5d-4517-b0df-8c5548f8298d\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317545 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-config\") pod \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317569 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-client-ca\") pod \"62880941-3b5d-4517-b0df-8c5548f8298d\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317603 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6rqm\" (UniqueName: \"kubernetes.io/projected/62880941-3b5d-4517-b0df-8c5548f8298d-kube-api-access-h6rqm\") pod \"62880941-3b5d-4517-b0df-8c5548f8298d\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.318769 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-proxy-ca-bundles" (OuterVolumeSpecName: 
"proxy-ca-bundles") pod "62880941-3b5d-4517-b0df-8c5548f8298d" (UID: "62880941-3b5d-4517-b0df-8c5548f8298d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.318784 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-config" (OuterVolumeSpecName: "config") pod "b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" (UID: "b04ba107-8dd1-4d8d-88d3-5a762f6c60f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.318775 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-client-ca" (OuterVolumeSpecName: "client-ca") pod "62880941-3b5d-4517-b0df-8c5548f8298d" (UID: "62880941-3b5d-4517-b0df-8c5548f8298d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.318836 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-config" (OuterVolumeSpecName: "config") pod "62880941-3b5d-4517-b0df-8c5548f8298d" (UID: "62880941-3b5d-4517-b0df-8c5548f8298d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.319238 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-client-ca" (OuterVolumeSpecName: "client-ca") pod "b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" (UID: "b04ba107-8dd1-4d8d-88d3-5a762f6c60f1"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.323360 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" (UID: "b04ba107-8dd1-4d8d-88d3-5a762f6c60f1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.323391 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62880941-3b5d-4517-b0df-8c5548f8298d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "62880941-3b5d-4517-b0df-8c5548f8298d" (UID: "62880941-3b5d-4517-b0df-8c5548f8298d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.323442 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-kube-api-access-txcb7" (OuterVolumeSpecName: "kube-api-access-txcb7") pod "b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" (UID: "b04ba107-8dd1-4d8d-88d3-5a762f6c60f1"). InnerVolumeSpecName "kube-api-access-txcb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.329902 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62880941-3b5d-4517-b0df-8c5548f8298d-kube-api-access-h6rqm" (OuterVolumeSpecName: "kube-api-access-h6rqm") pod "62880941-3b5d-4517-b0df-8c5548f8298d" (UID: "62880941-3b5d-4517-b0df-8c5548f8298d"). InnerVolumeSpecName "kube-api-access-h6rqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418676 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418720 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418734 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418749 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418762 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6rqm\" (UniqueName: \"kubernetes.io/projected/62880941-3b5d-4517-b0df-8c5548f8298d-kube-api-access-h6rqm\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418775 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txcb7\" (UniqueName: \"kubernetes.io/projected/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-kube-api-access-txcb7\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418787 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418799 4781 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418810 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62880941-3b5d-4517-b0df-8c5548f8298d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.912154 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.912160 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" event={"ID":"62880941-3b5d-4517-b0df-8c5548f8298d","Type":"ContainerDied","Data":"b10681d24475519937e6ebec881f7fb610590065caed7686034ef5962eec396d"} Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.912253 4781 scope.go:117] "RemoveContainer" containerID="b8e38f1e216924c6d8d1fbab43dc9e21f5123054b598be25b535e4702b38b353" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.914674 4781 generic.go:334] "Generic (PLEG): container finished" podID="b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" containerID="97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5" exitCode=0 Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.914748 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.914757 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" event={"ID":"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1","Type":"ContainerDied","Data":"97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5"} Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.914801 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" event={"ID":"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1","Type":"ContainerDied","Data":"f45bacdd853d75d92adb8b303b061914233c1920428a49de04d9a44ec247f82d"} Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.958503 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dccb78d65-ddtvh"] Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.958801 4781 scope.go:117] "RemoveContainer" containerID="97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.961467 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6dccb78d65-ddtvh"] Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.976840 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl"] Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.983071 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl"] Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.989134 4781 scope.go:117] "RemoveContainer" containerID="97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5" Feb 27 00:10:30 crc kubenswrapper[4781]: E0227 
00:10:30.990135 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5\": container with ID starting with 97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5 not found: ID does not exist" containerID="97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5" Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.990174 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5"} err="failed to get container status \"97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5\": rpc error: code = NotFound desc = could not find container \"97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5\": container with ID starting with 97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5 not found: ID does not exist" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.062758 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.150675 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b77c846cc-7b4k9"] Feb 27 00:10:31 crc kubenswrapper[4781]: E0227 00:10:31.151659 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="registry-server" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.151702 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="registry-server" Feb 27 00:10:31 crc kubenswrapper[4781]: E0227 00:10:31.151729 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" 
containerName="route-controller-manager" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.151749 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" containerName="route-controller-manager" Feb 27 00:10:31 crc kubenswrapper[4781]: E0227 00:10:31.151790 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerName="extract-utilities" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.151810 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerName="extract-utilities" Feb 27 00:10:31 crc kubenswrapper[4781]: E0227 00:10:31.151838 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerName="extract-content" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.151856 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerName="extract-content" Feb 27 00:10:31 crc kubenswrapper[4781]: E0227 00:10:31.151884 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerName="registry-server" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.151903 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerName="registry-server" Feb 27 00:10:31 crc kubenswrapper[4781]: E0227 00:10:31.151933 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="extract-content" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.151953 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="extract-content" Feb 27 00:10:31 crc kubenswrapper[4781]: E0227 00:10:31.151979 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="extract-utilities" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.151998 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="extract-utilities" Feb 27 00:10:31 crc kubenswrapper[4781]: E0227 00:10:31.152023 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62880941-3b5d-4517-b0df-8c5548f8298d" containerName="controller-manager" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.152043 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="62880941-3b5d-4517-b0df-8c5548f8298d" containerName="controller-manager" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.152302 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="registry-server" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.152348 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" containerName="route-controller-manager" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.152373 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerName="registry-server" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.152405 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="62880941-3b5d-4517-b0df-8c5548f8298d" containerName="controller-manager" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.153386 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.154512 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c"] Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.155320 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.160134 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.160524 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.163621 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.164183 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.164439 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.164705 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.164988 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.166297 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 
00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.166439 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.166569 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.169392 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.169611 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.182210 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.197750 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b77c846cc-7b4k9"] Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.201429 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c"] Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.319283 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62880941-3b5d-4517-b0df-8c5548f8298d" path="/var/lib/kubelet/pods/62880941-3b5d-4517-b0df-8c5548f8298d/volumes" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.319851 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" path="/var/lib/kubelet/pods/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1/volumes" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.344475 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b72f5f5-6a79-467f-b65c-4079430ea22c-client-ca\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.344742 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b72f5f5-6a79-467f-b65c-4079430ea22c-config\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.344828 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b72f5f5-6a79-467f-b65c-4079430ea22c-serving-cert\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.344903 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-client-ca\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.344991 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-config\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " 
pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.345096 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4stn\" (UniqueName: \"kubernetes.io/projected/5b72f5f5-6a79-467f-b65c-4079430ea22c-kube-api-access-n4stn\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.345175 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h7w6\" (UniqueName: \"kubernetes.io/projected/3849839b-223f-4c16-8aca-0f7b82e30586-kube-api-access-9h7w6\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.345241 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-proxy-ca-bundles\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.345320 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3849839b-223f-4c16-8aca-0f7b82e30586-serving-cert\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446701 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9h7w6\" (UniqueName: \"kubernetes.io/projected/3849839b-223f-4c16-8aca-0f7b82e30586-kube-api-access-9h7w6\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446744 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-proxy-ca-bundles\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446774 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3849839b-223f-4c16-8aca-0f7b82e30586-serving-cert\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446836 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b72f5f5-6a79-467f-b65c-4079430ea22c-config\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446858 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b72f5f5-6a79-467f-b65c-4079430ea22c-client-ca\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " 
pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446885 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b72f5f5-6a79-467f-b65c-4079430ea22c-serving-cert\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446913 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-client-ca\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446947 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-config\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446993 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4stn\" (UniqueName: \"kubernetes.io/projected/5b72f5f5-6a79-467f-b65c-4079430ea22c-kube-api-access-n4stn\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.455687 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 00:10:31 crc 
kubenswrapper[4781]: I0227 00:10:31.455983 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.456102 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.457080 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.458355 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b72f5f5-6a79-467f-b65c-4079430ea22c-config\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.459142 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.459168 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-config\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.459243 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.459374 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.459983 4781 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-client-ca\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.460115 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b72f5f5-6a79-467f-b65c-4079430ea22c-client-ca\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.464495 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b72f5f5-6a79-467f-b65c-4079430ea22c-serving-cert\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.469368 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-proxy-ca-bundles\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.470376 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.471931 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.474923 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3849839b-223f-4c16-8aca-0f7b82e30586-serving-cert\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.481154 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.483142 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.492513 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4stn\" (UniqueName: \"kubernetes.io/projected/5b72f5f5-6a79-467f-b65c-4079430ea22c-kube-api-access-n4stn\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.496545 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h7w6\" (UniqueName: \"kubernetes.io/projected/3849839b-223f-4c16-8aca-0f7b82e30586-kube-api-access-9h7w6\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.508476 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.515331 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.733824 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c"] Feb 27 00:10:31 crc kubenswrapper[4781]: W0227 00:10:31.749960 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b72f5f5_6a79_467f_b65c_4079430ea22c.slice/crio-3505b3780dca1dddfaee8c72b7f6f032411eb02bef4488e644d77b4d57ca8cb1 WatchSource:0}: Error finding container 3505b3780dca1dddfaee8c72b7f6f032411eb02bef4488e644d77b4d57ca8cb1: Status 404 returned error can't find the container with id 3505b3780dca1dddfaee8c72b7f6f032411eb02bef4488e644d77b4d57ca8cb1 Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.791542 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.800841 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.846423 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.847052 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.909992 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.923075 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" event={"ID":"5b72f5f5-6a79-467f-b65c-4079430ea22c","Type":"ContainerStarted","Data":"3505b3780dca1dddfaee8c72b7f6f032411eb02bef4488e644d77b4d57ca8cb1"} Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.972582 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.992075 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.992150 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.041693 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.192896 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:10:32 crc 
kubenswrapper[4781]: I0227 00:10:32.192944 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.238415 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.294455 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b77c846cc-7b4k9"] Feb 27 00:10:32 crc kubenswrapper[4781]: W0227 00:10:32.295246 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3849839b_223f_4c16_8aca_0f7b82e30586.slice/crio-26f5df4976ebf4d2761f889a8f03fbf5e4ed1b2fd4e3be10a9c60287deb69254 WatchSource:0}: Error finding container 26f5df4976ebf4d2761f889a8f03fbf5e4ed1b2fd4e3be10a9c60287deb69254: Status 404 returned error can't find the container with id 26f5df4976ebf4d2761f889a8f03fbf5e4ed1b2fd4e3be10a9c60287deb69254 Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.441664 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.441746 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.484128 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.927793 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" event={"ID":"3849839b-223f-4c16-8aca-0f7b82e30586","Type":"ContainerStarted","Data":"46f45b8f3a4098b7387e66d8e638b26e0546877ca1f6e13ddb0da4ec6df1e284"} 
Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.928127 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" event={"ID":"3849839b-223f-4c16-8aca-0f7b82e30586","Type":"ContainerStarted","Data":"26f5df4976ebf4d2761f889a8f03fbf5e4ed1b2fd4e3be10a9c60287deb69254"} Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.928806 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.931999 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" event={"ID":"5b72f5f5-6a79-467f-b65c-4079430ea22c","Type":"ContainerStarted","Data":"8a239fbb3c7f0ce079f37f476f358d6bc0c3a1ec38e0b9805a63bb642f15f6cf"} Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.932042 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.933895 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.939055 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.967830 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" podStartSLOduration=3.967809545 podStartE2EDuration="3.967809545s" podCreationTimestamp="2026-02-27 00:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-27 00:10:32.949539061 +0000 UTC m=+302.207078615" watchObservedRunningTime="2026-02-27 00:10:32.967809545 +0000 UTC m=+302.225349099" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.976153 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.988344 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" podStartSLOduration=3.988325816 podStartE2EDuration="3.988325816s" podCreationTimestamp="2026-02-27 00:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:10:32.983188658 +0000 UTC m=+302.240728222" watchObservedRunningTime="2026-02-27 00:10:32.988325816 +0000 UTC m=+302.245865380" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.989043 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.994189 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:10:33 crc kubenswrapper[4781]: I0227 00:10:33.596563 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2zhrk"] Feb 27 00:10:34 crc kubenswrapper[4781]: I0227 00:10:34.740782 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-52xgq"] Feb 27 00:10:34 crc kubenswrapper[4781]: I0227 00:10:34.941823 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-52xgq" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerName="registry-server" 
containerID="cri-o://257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8" gracePeriod=2 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.004218 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.004288 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.065420 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.466119 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.605555 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8558\" (UniqueName: \"kubernetes.io/projected/0f286d62-2145-4bbb-91eb-28ffda9b2494-kube-api-access-f8558\") pod \"0f286d62-2145-4bbb-91eb-28ffda9b2494\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.605717 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-catalog-content\") pod \"0f286d62-2145-4bbb-91eb-28ffda9b2494\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.605813 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-utilities\") pod \"0f286d62-2145-4bbb-91eb-28ffda9b2494\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.606668 4781 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-utilities" (OuterVolumeSpecName: "utilities") pod "0f286d62-2145-4bbb-91eb-28ffda9b2494" (UID: "0f286d62-2145-4bbb-91eb-28ffda9b2494"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.616606 4781 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.617044 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerName="registry-server" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.617057 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerName="registry-server" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.617071 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerName="extract-content" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.617077 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerName="extract-content" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.617088 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerName="extract-utilities" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.617094 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerName="extract-utilities" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.617197 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerName="registry-server" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 
00:10:35.617543 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.618525 4781 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.618847 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c" gracePeriod=15 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.618871 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d" gracePeriod=15 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.618887 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226" gracePeriod=15 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.618916 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b" gracePeriod=15 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.618984 4781 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5" gracePeriod=15 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.624764 4781 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625164 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625189 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625210 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625225 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625243 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625256 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625285 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625302 4781 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625322 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625334 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625355 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625368 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625388 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625400 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625418 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625431 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625448 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625461 
4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625481 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625494 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625728 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625750 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625772 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625791 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625811 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625829 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625850 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625882 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.626262 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.627751 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f286d62-2145-4bbb-91eb-28ffda9b2494-kube-api-access-f8558" (OuterVolumeSpecName: "kube-api-access-f8558") pod "0f286d62-2145-4bbb-91eb-28ffda9b2494" (UID: "0f286d62-2145-4bbb-91eb-28ffda9b2494"). InnerVolumeSpecName "kube-api-access-f8558". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.673177 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f286d62-2145-4bbb-91eb-28ffda9b2494" (UID: "0f286d62-2145-4bbb-91eb-28ffda9b2494"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.707887 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.707931 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.707955 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.707973 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.707992 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.708052 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.708085 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.708107 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.708164 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.708175 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8558\" (UniqueName: \"kubernetes.io/projected/0f286d62-2145-4bbb-91eb-28ffda9b2494-kube-api-access-f8558\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.708184 4781 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809244 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809295 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809320 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809338 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809352 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809370 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809381 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809425 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809392 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809453 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809486 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809468 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809453 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809525 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809545 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809590 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.950786 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.952914 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.953876 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226" exitCode=0
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.953914 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b" exitCode=0
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.953932 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d" exitCode=0
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.953948 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5" exitCode=2
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.954019 4781 scope.go:117] "RemoveContainer" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.956231 4781 generic.go:334] "Generic (PLEG): container finished" podID="7c8795e9-9244-4cc4-a297-3aec68bf3588" containerID="2c434d493ffe4d5672fb6269468215eb15ce1d96ef38aac19ec03432d5d7c9b5" exitCode=0
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.956314 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7c8795e9-9244-4cc4-a297-3aec68bf3588","Type":"ContainerDied","Data":"2c434d493ffe4d5672fb6269468215eb15ce1d96ef38aac19ec03432d5d7c9b5"}
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.957096 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.957733 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.961522 4781 generic.go:334] "Generic (PLEG): container finished" podID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerID="257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8" exitCode=0
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.961593 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52xgq"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.961668 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52xgq" event={"ID":"0f286d62-2145-4bbb-91eb-28ffda9b2494","Type":"ContainerDied","Data":"257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8"}
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.961754 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52xgq" event={"ID":"0f286d62-2145-4bbb-91eb-28ffda9b2494","Type":"ContainerDied","Data":"dc9d59b8ab934cad32f1842b836646a3832e9408664f5c6c345f309f196516de"}
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.962318 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.962735 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.963266 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.978544 4781 scope.go:117] "RemoveContainer" containerID="257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.987531 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.989372 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.990121 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.005054 4781 scope.go:117] "RemoveContainer" containerID="4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f"
Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.016076 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hcdz5"
Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.016896 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.017249 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.017747 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.018366 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.032442 4781 scope.go:117] "RemoveContainer" containerID="0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce"
Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.050980 4781 scope.go:117] "RemoveContainer" containerID="257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8"
Feb 27 00:10:36 crc kubenswrapper[4781]: E0227 00:10:36.051536 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8\": container with ID starting with 257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8 not found: ID does not exist" containerID="257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8"
Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.051582 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8"} err="failed to get container status \"257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8\": rpc error: code = NotFound desc = could not find container \"257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8\": container with ID starting with 257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8 not found: ID does not exist"
Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.051617 4781 scope.go:117] "RemoveContainer" containerID="4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f"
Feb 27 00:10:36 crc kubenswrapper[4781]: E0227 00:10:36.052220 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f\": container with ID starting with 4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f not found: ID does not exist" containerID="4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f"
Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.052267 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f"} err="failed to get container status \"4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f\": rpc error: code = NotFound desc = could not find container \"4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f\": container with ID starting with 4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f not found: ID does not exist"
Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.052297 4781 scope.go:117] "RemoveContainer" containerID="0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce"
Feb 27 00:10:36 crc kubenswrapper[4781]: E0227 00:10:36.052588 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce\": container with ID starting with 0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce not found: ID does not exist" containerID="0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce"
Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.052654 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce"} err="failed to get container status \"0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce\": rpc error: code = NotFound desc = could not find container \"0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce\": container with ID starting with 0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce not found: ID does not exist"
Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.975093 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 27 00:10:37 crc kubenswrapper[4781]: E0227 00:10:37.336088 4781 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.89:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" volumeName="registry-storage"
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.407812 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.409830 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.410533 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.411014 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.440123 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-kubelet-dir\") pod \"7c8795e9-9244-4cc4-a297-3aec68bf3588\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") "
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.440568 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c8795e9-9244-4cc4-a297-3aec68bf3588-kube-api-access\") pod \"7c8795e9-9244-4cc4-a297-3aec68bf3588\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") "
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.440877 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-var-lock\") pod \"7c8795e9-9244-4cc4-a297-3aec68bf3588\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") "
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.440306 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7c8795e9-9244-4cc4-a297-3aec68bf3588" (UID: "7c8795e9-9244-4cc4-a297-3aec68bf3588"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.441591 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-var-lock" (OuterVolumeSpecName: "var-lock") pod "7c8795e9-9244-4cc4-a297-3aec68bf3588" (UID: "7c8795e9-9244-4cc4-a297-3aec68bf3588"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.446776 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c8795e9-9244-4cc4-a297-3aec68bf3588-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7c8795e9-9244-4cc4-a297-3aec68bf3588" (UID: "7c8795e9-9244-4cc4-a297-3aec68bf3588"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.542900 4781 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.542934 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c8795e9-9244-4cc4-a297-3aec68bf3588-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.542945 4781 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-var-lock\") on node \"crc\" DevicePath \"\""
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.985051 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7c8795e9-9244-4cc4-a297-3aec68bf3588","Type":"ContainerDied","Data":"f39f3390eb5e42a403a575333d110cbe5ece9b7617819b5c4f74d934848ba9f2"}
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.985530 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f39f3390eb5e42a403a575333d110cbe5ece9b7617819b5c4f74d934848ba9f2"
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.985111 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.988367 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.989345 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.989489 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.989902 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.990346 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.990745 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c" exitCode=0
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.990736 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.990783 4781 scope.go:117] "RemoveContainer" containerID="b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226"
Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.991457 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.006723 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.007251 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.007704 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.008135 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.008581 4781 scope.go:117] "RemoveContainer" containerID="0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.022593 4781 scope.go:117] "RemoveContainer" containerID="543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.040767 4781 scope.go:117] "RemoveContainer" containerID="6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.046842 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.046901 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.046949 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.046961 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.047015 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.047114 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.047228 4781 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.047257 4781 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.047275 4781 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.060017 4781 scope.go:117] "RemoveContainer" containerID="4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.078538 4781 scope.go:117] "RemoveContainer" containerID="240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.095416 4781 scope.go:117] "RemoveContainer" containerID="b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226"
Feb 27 00:10:38 crc kubenswrapper[4781]: E0227 00:10:38.095836 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\": container with ID starting with b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226 not found: ID does not exist" containerID="b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.095960 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226"} err="failed to get container status \"b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\": rpc error: code = NotFound desc = could not find container \"b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\": container with ID starting with b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226 not found: ID does not exist"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.095990 4781 scope.go:117] "RemoveContainer" containerID="0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b"
Feb 27 00:10:38 crc kubenswrapper[4781]: E0227 00:10:38.096195 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\": container with ID starting with 0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b not found: ID does not exist" containerID="0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.096232 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b"} err="failed to get container status \"0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\": rpc error: code = NotFound desc = could not find container \"0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\": container with ID starting with 0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b not found: ID does not exist"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.096247 4781 scope.go:117] "RemoveContainer" containerID="543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d"
Feb 27 00:10:38 crc kubenswrapper[4781]: E0227 00:10:38.096579 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\": container with ID starting with 543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d not found: ID does not exist" containerID="543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.096612 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d"} err="failed to get container status \"543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\": rpc error: code = NotFound desc = could not find container \"543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\": container with ID starting with 543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d not found: ID does not exist"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.096654 4781 scope.go:117] "RemoveContainer" containerID="6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5"
Feb 27 00:10:38 crc kubenswrapper[4781]: E0227 00:10:38.097075 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\": container with ID starting with 6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5 not found: ID does not exist" containerID="6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.097110 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5"} err="failed to get container status \"6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\": rpc error: code = NotFound desc = could not find container \"6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\": container with ID starting with 6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5 not found: ID does not exist"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.097130 4781 scope.go:117] "RemoveContainer" containerID="4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c"
Feb 27 00:10:38 crc kubenswrapper[4781]: E0227 00:10:38.097476 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\": container with ID starting with 4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c not found: ID does not exist" containerID="4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.097499 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c"} err="failed to get container status \"4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\": rpc error: code = NotFound desc = could not find container \"4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\": container with ID starting with 4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c not found: ID does not exist"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.097516 4781 scope.go:117] "RemoveContainer" containerID="240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3"
Feb 27 00:10:38 crc kubenswrapper[4781]: E0227 00:10:38.097794 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\": container with ID starting with 240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3 not found: ID does not exist" containerID="240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.097811 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3"} err="failed to get container status \"240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\": rpc error: code = NotFound desc = could not find container \"240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\": container with ID starting with 240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3 not found: ID does not exist"
Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.996777 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 00:10:39 crc kubenswrapper[4781]: I0227 00:10:39.014015 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:39 crc kubenswrapper[4781]: I0227 00:10:39.014335 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:39 crc kubenswrapper[4781]: I0227 00:10:39.014761 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:39 crc kubenswrapper[4781]: I0227 00:10:39.015081 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused"
Feb 27 00:10:39 crc kubenswrapper[4781]: I0227 00:10:39.315947 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 27 00:10:40 crc kubenswrapper[4781]: E0227 00:10:40.656574 4781 kubelet.go:1929] "Failed creating a mirror pod for"
err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.89:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:40 crc kubenswrapper[4781]: I0227 00:10:40.657376 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:40 crc kubenswrapper[4781]: W0227 00:10:40.697563 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-2ce9262d5fb63261e35b2cd7782ae4d70d1a8c54505505b786cc12676dea8783 WatchSource:0}: Error finding container 2ce9262d5fb63261e35b2cd7782ae4d70d1a8c54505505b786cc12676dea8783: Status 404 returned error can't find the container with id 2ce9262d5fb63261e35b2cd7782ae4d70d1a8c54505505b786cc12676dea8783 Feb 27 00:10:40 crc kubenswrapper[4781]: E0227 00:10:40.704374 4781 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.89:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897f1fb755d17ad openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:10:40.701700013 +0000 UTC m=+309.959239607,LastTimestamp:2026-02-27 00:10:40.701700013 
+0000 UTC m=+309.959239607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:10:41 crc kubenswrapper[4781]: I0227 00:10:41.014116 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2ce9262d5fb63261e35b2cd7782ae4d70d1a8c54505505b786cc12676dea8783"} Feb 27 00:10:41 crc kubenswrapper[4781]: I0227 00:10:41.318372 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:41 crc kubenswrapper[4781]: I0227 00:10:41.318925 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:41 crc kubenswrapper[4781]: I0227 00:10:41.319506 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: I0227 00:10:42.026345 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae"} Feb 27 00:10:42 crc kubenswrapper[4781]: I0227 00:10:42.027429 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: E0227 00:10:42.027563 4781 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.89:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:42 crc kubenswrapper[4781]: I0227 00:10:42.028206 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: I0227 00:10:42.028898 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: E0227 00:10:42.753578 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: E0227 
00:10:42.754804 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: E0227 00:10:42.755505 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: E0227 00:10:42.756022 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: E0227 00:10:42.756544 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: I0227 00:10:42.756609 4781 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 27 00:10:42 crc kubenswrapper[4781]: E0227 00:10:42.757115 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="200ms" Feb 27 00:10:42 crc kubenswrapper[4781]: E0227 00:10:42.958700 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="400ms" Feb 27 
00:10:43 crc kubenswrapper[4781]: E0227 00:10:43.033767 4781 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.89:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:43 crc kubenswrapper[4781]: E0227 00:10:43.359917 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="800ms" Feb 27 00:10:43 crc kubenswrapper[4781]: E0227 00:10:43.567385 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:10:43Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:10:43Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:10:43Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:10:43Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:43 crc kubenswrapper[4781]: E0227 00:10:43.567918 4781 
kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:43 crc kubenswrapper[4781]: E0227 00:10:43.568370 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:43 crc kubenswrapper[4781]: E0227 00:10:43.568644 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:43 crc kubenswrapper[4781]: E0227 00:10:43.568930 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:43 crc kubenswrapper[4781]: E0227 00:10:43.568956 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:10:44 crc kubenswrapper[4781]: E0227 00:10:44.161475 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="1.6s" Feb 27 00:10:44 crc kubenswrapper[4781]: E0227 00:10:44.511950 4781 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.89:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897f1fb755d17ad 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:10:40.701700013 +0000 UTC m=+309.959239607,LastTimestamp:2026-02-27 00:10:40.701700013 +0000 UTC m=+309.959239607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:10:45 crc kubenswrapper[4781]: E0227 00:10:45.762154 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="3.2s" Feb 27 00:10:47 crc kubenswrapper[4781]: I0227 00:10:47.308730 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:47 crc kubenswrapper[4781]: I0227 00:10:47.310578 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:47 crc kubenswrapper[4781]: I0227 00:10:47.312401 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:47 crc kubenswrapper[4781]: I0227 00:10:47.312964 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:47 crc kubenswrapper[4781]: I0227 00:10:47.328400 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:47 crc kubenswrapper[4781]: I0227 00:10:47.328445 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:47 crc kubenswrapper[4781]: E0227 00:10:47.328972 4781 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:47 crc kubenswrapper[4781]: I0227 00:10:47.330054 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:48 crc kubenswrapper[4781]: I0227 00:10:48.065600 4781 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d67e71348f506acedfd49fdb01193f240453b3872fba7f2a05afdc150dc45413" exitCode=0 Feb 27 00:10:48 crc kubenswrapper[4781]: I0227 00:10:48.065700 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d67e71348f506acedfd49fdb01193f240453b3872fba7f2a05afdc150dc45413"} Feb 27 00:10:48 crc kubenswrapper[4781]: I0227 00:10:48.065753 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f2ce1cb1f0fff81502f09902c57e6d3b5f4c1bac68a1f8eb9b62c40a247d6e03"} Feb 27 00:10:48 crc kubenswrapper[4781]: I0227 00:10:48.066323 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:48 crc kubenswrapper[4781]: I0227 00:10:48.066381 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:48 crc kubenswrapper[4781]: E0227 00:10:48.066928 4781 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:48 crc kubenswrapper[4781]: I0227 00:10:48.066971 4781 status_manager.go:851] "Failed to get status for 
pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:48 crc kubenswrapper[4781]: I0227 00:10:48.067683 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:48 crc kubenswrapper[4781]: I0227 00:10:48.068160 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.079092 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"73b4ddb5f02dd3ec03c7150c2b525684a96ea163e30959b1ed9e9a9674ccf851"} Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.079408 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"552e0f533e4924e6179bea2f9c4b1fb5e7d0546b88086c647c01546499d8c0d4"} Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.079419 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3e1e3d72e819f646e1d954dce8e87cc4f81172c85f1fe1c9c812b5e508ca8ebf"} Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.079426 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3443d9392221569c894c37e3e3dc0d3d0be3bd7cf0464ea3ca776bc75e620169"} Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.081916 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.082535 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.082581 4781 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477" exitCode=1 Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.082623 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477"} Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.083130 4781 scope.go:117] "RemoveContainer" containerID="9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477" Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.218091 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:10:50 crc kubenswrapper[4781]: I0227 00:10:50.096244 
4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 00:10:50 crc kubenswrapper[4781]: I0227 00:10:50.097468 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 27 00:10:50 crc kubenswrapper[4781]: I0227 00:10:50.097615 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"74e637562fda9341501dc3a4f8ff7bcfb06a0ef864a2010d7f005ad8286d96b1"} Feb 27 00:10:50 crc kubenswrapper[4781]: I0227 00:10:50.101956 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bf722891fbe8788a067a754a957ea026cbd8eccc9ffa5377e7b75e0242c2f3d1"} Feb 27 00:10:50 crc kubenswrapper[4781]: I0227 00:10:50.102242 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:50 crc kubenswrapper[4781]: I0227 00:10:50.102361 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:50 crc kubenswrapper[4781]: I0227 00:10:50.102393 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:52 crc kubenswrapper[4781]: I0227 00:10:52.330537 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:52 crc kubenswrapper[4781]: I0227 00:10:52.330927 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:52 crc kubenswrapper[4781]: I0227 00:10:52.338060 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:55 crc kubenswrapper[4781]: I0227 00:10:55.002349 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:10:55 crc kubenswrapper[4781]: I0227 00:10:55.002554 4781 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 27 00:10:55 crc kubenswrapper[4781]: I0227 00:10:55.002766 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 27 00:10:55 crc kubenswrapper[4781]: I0227 00:10:55.113336 4781 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:55 crc kubenswrapper[4781]: I0227 00:10:55.132421 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:55 crc kubenswrapper[4781]: I0227 00:10:55.132450 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:55 crc kubenswrapper[4781]: I0227 00:10:55.137479 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:55 crc kubenswrapper[4781]: I0227 00:10:55.139669 4781 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="456df73e-2d2d-4980-b2e6-9e45d9cd002b" Feb 27 00:10:56 crc kubenswrapper[4781]: I0227 00:10:56.138186 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:56 crc kubenswrapper[4781]: I0227 00:10:56.138222 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:58 crc kubenswrapper[4781]: I0227 00:10:58.633429 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" podUID="cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" containerName="oauth-openshift" containerID="cri-o://126cdc4a0b2277218672d3ebb95a09f31d56ce658295a2f26c3118300b77f809" gracePeriod=15 Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.158083 4781 generic.go:334] "Generic (PLEG): container finished" podID="cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" containerID="126cdc4a0b2277218672d3ebb95a09f31d56ce658295a2f26c3118300b77f809" exitCode=0 Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.158141 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" event={"ID":"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b","Type":"ContainerDied","Data":"126cdc4a0b2277218672d3ebb95a09f31d56ce658295a2f26c3118300b77f809"} Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.218707 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.227175 4781 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366508 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-login\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366565 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-router-certs\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366615 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-ocp-branding-template\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366685 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-policies\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366730 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-provider-selection\") pod 
\"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366789 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-cliconfig\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366843 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-error\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366876 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr2kq\" (UniqueName: \"kubernetes.io/projected/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-kube-api-access-kr2kq\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366911 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-idp-0-file-data\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366947 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-trusted-ca-bundle\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: 
\"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366974 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-session\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.367005 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-serving-cert\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.367047 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-dir\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.367081 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-service-ca\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.368430 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.368458 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.368500 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.368446 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.368551 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.372903 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.373428 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.373657 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.374027 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.374203 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-kube-api-access-kr2kq" (OuterVolumeSpecName: "kube-api-access-kr2kq") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "kube-api-access-kr2kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.374395 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.374905 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.375150 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.375778 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.470754 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471022 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr2kq\" (UniqueName: \"kubernetes.io/projected/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-kube-api-access-kr2kq\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471210 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471305 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471394 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-session\") on node 
\"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471481 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471583 4781 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471721 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471818 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471908 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.472025 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.472131 4781 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.472225 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.472316 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:00 crc kubenswrapper[4781]: I0227 00:11:00.167184 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" event={"ID":"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b","Type":"ContainerDied","Data":"8fca182de0e27c23c7179b6016938f296917be6dc1ab0f25ef6496e1df363d8a"} Feb 27 00:11:00 crc kubenswrapper[4781]: I0227 00:11:00.167253 4781 scope.go:117] "RemoveContainer" containerID="126cdc4a0b2277218672d3ebb95a09f31d56ce658295a2f26c3118300b77f809" Feb 27 00:11:00 crc kubenswrapper[4781]: I0227 00:11:00.167256 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:11:01 crc kubenswrapper[4781]: I0227 00:11:01.319152 4781 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="456df73e-2d2d-4980-b2e6-9e45d9cd002b" Feb 27 00:11:05 crc kubenswrapper[4781]: I0227 00:11:05.340932 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:11:05 crc kubenswrapper[4781]: I0227 00:11:05.347816 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:11:05 crc kubenswrapper[4781]: I0227 00:11:05.463455 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 27 00:11:05 crc kubenswrapper[4781]: I0227 00:11:05.853500 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 27 00:11:05 crc kubenswrapper[4781]: I0227 00:11:05.859814 4781 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.243683 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.374319 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.389298 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.400163 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"metrics-tls" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.541304 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.543272 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.592586 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.633451 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.884454 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.888136 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.127916 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.140329 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.527359 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.552854 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.700210 4781 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.797956 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.844959 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.885160 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.918769 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.930747 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 27 00:11:08 crc kubenswrapper[4781]: I0227 00:11:08.215846 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 27 00:11:08 crc kubenswrapper[4781]: I0227 00:11:08.228412 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 27 00:11:08 crc kubenswrapper[4781]: I0227 00:11:08.536563 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 27 00:11:08 crc kubenswrapper[4781]: I0227 00:11:08.654442 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 27 00:11:08 crc kubenswrapper[4781]: I0227 00:11:08.777087 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 27 00:11:08 crc kubenswrapper[4781]: I0227 00:11:08.861663 4781 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 27 00:11:08 crc kubenswrapper[4781]: I0227 00:11:08.895441 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 27 00:11:08 crc kubenswrapper[4781]: I0227 00:11:08.978787 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 27 00:11:09 crc kubenswrapper[4781]: I0227 00:11:09.071396 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 27 00:11:09 crc kubenswrapper[4781]: I0227 00:11:09.125347 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 00:11:09 crc kubenswrapper[4781]: I0227 00:11:09.382075 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 27 00:11:09 crc kubenswrapper[4781]: I0227 00:11:09.510582 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 27 00:11:09 crc kubenswrapper[4781]: I0227 00:11:09.646605 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 27 00:11:09 crc kubenswrapper[4781]: I0227 00:11:09.883595 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 27 00:11:09 crc kubenswrapper[4781]: I0227 00:11:09.980460 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.017208 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 
00:11:10.051936 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.065120 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.117382 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.132753 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.286060 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.286117 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.408914 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.467324 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.544834 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.648975 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.660974 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.677492 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.699985 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.713510 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.754368 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.811090 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.873792 4781 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.955086 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.987389 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.114876 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.186401 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 27 
00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.244272 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.245039 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.326561 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.343978 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.374148 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.492027 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.500149 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.536262 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.601995 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.701983 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.755186 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.807319 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.812173 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.844485 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.846778 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.874344 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.931473 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.009849 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.252727 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.345996 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.355583 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 27 
00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.473957 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.535798 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.639893 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.685213 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.750124 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.799790 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.890329 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.937456 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.085817 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.246384 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.259946 4781 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.377405 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.382949 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.430009 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.481679 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.495709 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.529830 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.569064 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.581255 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.706909 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.723376 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 27 00:11:13 crc 
kubenswrapper[4781]: I0227 00:11:13.830214 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.851730 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.940687 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.953844 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.052226 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.072239 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.176387 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.197263 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.241605 4781 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.253922 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.277662 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.368898 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.418173 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.438098 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.449076 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.480969 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.527516 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.654311 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.725821 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.750951 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.762119 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 27 00:11:14 crc 
kubenswrapper[4781]: I0227 00:11:14.787929 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.814970 4781 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.823609 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.835509 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.936743 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.942768 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.997064 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.064851 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.066794 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.218052 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.363788 4781 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.431072 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.476198 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.550007 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.614417 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.675803 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.740663 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.760136 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.767703 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.935448 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.972681 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.983105 4781 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.011884 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.034777 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.048427 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.131436 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.132431 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.209080 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.269507 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.431062 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.481978 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.505800 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"machine-api-operator-images" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.706923 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.772937 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.790698 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.843689 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.015604 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.111746 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.200298 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.318540 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.325438 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.397296 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.406322 4781 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.436353 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.480115 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.480539 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.538177 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.642289 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.698756 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.726999 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.794817 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.803235 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.889057 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.896836 4781 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.136683 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.203456 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.345781 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.549484 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.675247 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.698244 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.723180 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.727291 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.742041 4781 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.807290 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 27 00:11:18 crc 
kubenswrapper[4781]: I0227 00:11:18.813731 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.865549 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.915226 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.165018 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.208297 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.319342 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.476489 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.520309 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.603340 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.628007 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.638707 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.670098 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.803947 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.865579 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.932272 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.944273 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.035939 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.068046 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.141869 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.146316 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.428499 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"etcd-serving-ca" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.551644 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.729926 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.813465 4781 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.817925 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-2zhrk","openshift-marketplace/certified-operators-52xgq"] Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.817995 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-79d557fb64-sq8tq"] Feb 27 00:11:20 crc kubenswrapper[4781]: E0227 00:11:20.818161 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" containerName="installer" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.818174 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" containerName="installer" Feb 27 00:11:20 crc kubenswrapper[4781]: E0227 00:11:20.818190 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" containerName="oauth-openshift" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.818197 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" containerName="oauth-openshift" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.818378 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.818400 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.818440 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" containerName="installer" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.818458 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" containerName="oauth-openshift" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.818856 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.823070 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.823443 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.823608 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.823697 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.823852 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.824017 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 27 00:11:20 crc 
kubenswrapper[4781]: I0227 00:11:20.824139 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.824296 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.824613 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.826472 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.826823 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.826945 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.832553 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.834849 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.841957 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.852214 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.858807 4781 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.858774839 podStartE2EDuration="25.858774839s" podCreationTimestamp="2026-02-27 00:10:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:11:20.849792887 +0000 UTC m=+350.107332541" watchObservedRunningTime="2026-02-27 00:11:20.858774839 +0000 UTC m=+350.116314433" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.872838 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.946809 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.954718 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.954774 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.954813 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.954852 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.954971 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955022 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-login\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955062 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955098 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ffe89698-729a-4a15-92c3-3a095a00fb26-audit-dir\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955155 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955210 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-session\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955384 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-audit-policies\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " 
pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955477 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm28d\" (UniqueName: \"kubernetes.io/projected/ffe89698-729a-4a15-92c3-3a095a00fb26-kube-api-access-cm28d\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955552 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-error\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955672 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.027980 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.056767 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-session\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " 
pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.056847 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-audit-policies\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.056883 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm28d\" (UniqueName: \"kubernetes.io/projected/ffe89698-729a-4a15-92c3-3a095a00fb26-kube-api-access-cm28d\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.056934 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-error\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.056977 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057028 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057072 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057108 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057144 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057202 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " 
pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057237 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-login\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057278 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057312 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ffe89698-729a-4a15-92c3-3a095a00fb26-audit-dir\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057357 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057527 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/ffe89698-729a-4a15-92c3-3a095a00fb26-audit-dir\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.058475 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.058869 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.058934 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.059925 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-audit-policies\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 
00:11:21.065724 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.066141 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.066472 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.066710 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-error\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.066814 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.068352 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-session\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.069254 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.070386 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-login\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.080402 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm28d\" (UniqueName: \"kubernetes.io/projected/ffe89698-729a-4a15-92c3-3a095a00fb26-kube-api-access-cm28d\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 
27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.119528 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.146843 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.257901 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.317103 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" path="/var/lib/kubelet/pods/0f286d62-2145-4bbb-91eb-28ffda9b2494/volumes" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.318273 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" path="/var/lib/kubelet/pods/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b/volumes" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.339071 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.491984 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.492319 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.611420 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79d557fb64-sq8tq"] Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.693507 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 27 00:11:21 crc kubenswrapper[4781]: 
I0227 00:11:21.750857 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.789699 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.972333 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.974144 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.077190 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.317488 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" event={"ID":"ffe89698-729a-4a15-92c3-3a095a00fb26","Type":"ContainerStarted","Data":"ee4c674b04255d04ca76a5be26bbbed3a63bf79f43764cb7a910fad54e7db94f"} Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.317536 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" event={"ID":"ffe89698-729a-4a15-92c3-3a095a00fb26","Type":"ContainerStarted","Data":"fd787affcc48f57305115dbaeabfdf2a6725e9721c9f1989d94e4cbcaf0bdefd"} Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.317875 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.318227 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 
00:11:22.448579 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.465766 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" podStartSLOduration=49.465740896 podStartE2EDuration="49.465740896s" podCreationTimestamp="2026-02-27 00:10:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:11:22.347475361 +0000 UTC m=+351.605014905" watchObservedRunningTime="2026-02-27 00:11:22.465740896 +0000 UTC m=+351.723280480" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.488540 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.516211 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.530112 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.631929 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.739300 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.924679 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 27 00:11:23 crc kubenswrapper[4781]: I0227 00:11:23.092493 4781 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 27 00:11:23 crc kubenswrapper[4781]: I0227 00:11:23.597572 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 27 00:11:23 crc kubenswrapper[4781]: I0227 00:11:23.886067 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 27 00:11:24 crc kubenswrapper[4781]: I0227 00:11:24.563962 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 27 00:11:25 crc kubenswrapper[4781]: I0227 00:11:25.600697 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 27 00:11:28 crc kubenswrapper[4781]: I0227 00:11:28.802422 4781 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 27 00:11:28 crc kubenswrapper[4781]: I0227 00:11:28.804657 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae" gracePeriod=5 Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.389124 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.389186 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.405094 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.405132 4781 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae" exitCode=137 Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.405170 4781 scope.go:117] "RemoveContainer" containerID="2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.405270 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.432429 4781 scope.go:117] "RemoveContainer" containerID="2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae" Feb 27 00:11:34 crc kubenswrapper[4781]: E0227 00:11:34.433077 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae\": container with ID starting with 2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae not found: ID does not exist" containerID="2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.433139 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae"} err="failed to get container status \"2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae\": rpc error: code = NotFound desc = could 
not find container \"2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae\": container with ID starting with 2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae not found: ID does not exist" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.562374 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.562436 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.562524 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.562578 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.562663 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.562780 4781 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.562846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.562895 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.564591 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.565115 4781 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.565146 4781 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.565167 4781 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.565586 4781 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.577027 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.666853 4781 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:35 crc kubenswrapper[4781]: I0227 00:11:35.321022 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 27 00:11:39 crc kubenswrapper[4781]: I0227 00:11:39.441481 4781 generic.go:334] "Generic (PLEG): container finished" podID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerID="ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e" exitCode=0 Feb 27 00:11:39 crc kubenswrapper[4781]: I0227 00:11:39.441569 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" event={"ID":"6dc17f1d-c1f4-43b9-9291-7c32c6804d44","Type":"ContainerDied","Data":"ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e"} Feb 27 00:11:39 crc kubenswrapper[4781]: I0227 00:11:39.443132 4781 scope.go:117] "RemoveContainer" containerID="ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e" Feb 27 00:11:40 crc kubenswrapper[4781]: I0227 00:11:40.449239 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" event={"ID":"6dc17f1d-c1f4-43b9-9291-7c32c6804d44","Type":"ContainerStarted","Data":"d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1"} Feb 27 00:11:40 crc kubenswrapper[4781]: I0227 00:11:40.450315 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:11:40 crc kubenswrapper[4781]: I0227 00:11:40.452418 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.094299 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kqrgb"] Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.095319 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kqrgb" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerName="registry-server" containerID="cri-o://ec7472b1d4abe3539fd2b9c6a74552c975f1e7a845d80d7f3684a0e55a838de1" gracePeriod=2 Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.556373 4781 generic.go:334] "Generic (PLEG): container finished" podID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerID="ec7472b1d4abe3539fd2b9c6a74552c975f1e7a845d80d7f3684a0e55a838de1" exitCode=0 Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.556493 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqrgb" event={"ID":"ac30245d-7e42-440c-99a0-60e2ae15cb8b","Type":"ContainerDied","Data":"ec7472b1d4abe3539fd2b9c6a74552c975f1e7a845d80d7f3684a0e55a838de1"} Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.556792 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqrgb" event={"ID":"ac30245d-7e42-440c-99a0-60e2ae15cb8b","Type":"ContainerDied","Data":"ba66da6dc8bfa69982da2943397bfec42cd942427662c0a4732f24accf5f77a6"} Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.556817 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba66da6dc8bfa69982da2943397bfec42cd942427662c0a4732f24accf5f77a6" Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.557044 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.701212 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tk7f\" (UniqueName: \"kubernetes.io/projected/ac30245d-7e42-440c-99a0-60e2ae15cb8b-kube-api-access-8tk7f\") pod \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.701398 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-catalog-content\") pod \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.701443 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-utilities\") pod \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.703042 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-utilities" (OuterVolumeSpecName: "utilities") pod "ac30245d-7e42-440c-99a0-60e2ae15cb8b" (UID: "ac30245d-7e42-440c-99a0-60e2ae15cb8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.709547 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac30245d-7e42-440c-99a0-60e2ae15cb8b-kube-api-access-8tk7f" (OuterVolumeSpecName: "kube-api-access-8tk7f") pod "ac30245d-7e42-440c-99a0-60e2ae15cb8b" (UID: "ac30245d-7e42-440c-99a0-60e2ae15cb8b"). InnerVolumeSpecName "kube-api-access-8tk7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.777571 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac30245d-7e42-440c-99a0-60e2ae15cb8b" (UID: "ac30245d-7e42-440c-99a0-60e2ae15cb8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.802577 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tk7f\" (UniqueName: \"kubernetes.io/projected/ac30245d-7e42-440c-99a0-60e2ae15cb8b-kube-api-access-8tk7f\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.802604 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.802615 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:59 crc kubenswrapper[4781]: I0227 00:11:59.563534 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:11:59 crc kubenswrapper[4781]: I0227 00:11:59.589096 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kqrgb"] Feb 27 00:11:59 crc kubenswrapper[4781]: I0227 00:11:59.597510 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kqrgb"] Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.165476 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535852-49dfn"] Feb 27 00:12:00 crc kubenswrapper[4781]: E0227 00:12:00.166224 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerName="registry-server" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.166261 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerName="registry-server" Feb 27 00:12:00 crc kubenswrapper[4781]: E0227 00:12:00.166299 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerName="extract-utilities" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.166315 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerName="extract-utilities" Feb 27 00:12:00 crc kubenswrapper[4781]: E0227 00:12:00.166337 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerName="extract-content" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.166351 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerName="extract-content" Feb 27 00:12:00 crc kubenswrapper[4781]: E0227 00:12:00.166375 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 27 
00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.166389 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.166593 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerName="registry-server" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.166620 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.167588 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535852-49dfn" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.174509 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.174512 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535852-49dfn"] Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.174694 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.176013 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.345360 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gq6k\" (UniqueName: \"kubernetes.io/projected/96ecbd6e-c579-40ca-a5bf-9876777721f9-kube-api-access-8gq6k\") pod \"auto-csr-approver-29535852-49dfn\" (UID: \"96ecbd6e-c579-40ca-a5bf-9876777721f9\") " pod="openshift-infra/auto-csr-approver-29535852-49dfn" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.446493 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gq6k\" (UniqueName: \"kubernetes.io/projected/96ecbd6e-c579-40ca-a5bf-9876777721f9-kube-api-access-8gq6k\") pod \"auto-csr-approver-29535852-49dfn\" (UID: \"96ecbd6e-c579-40ca-a5bf-9876777721f9\") " pod="openshift-infra/auto-csr-approver-29535852-49dfn" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.469379 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gq6k\" (UniqueName: \"kubernetes.io/projected/96ecbd6e-c579-40ca-a5bf-9876777721f9-kube-api-access-8gq6k\") pod \"auto-csr-approver-29535852-49dfn\" (UID: \"96ecbd6e-c579-40ca-a5bf-9876777721f9\") " pod="openshift-infra/auto-csr-approver-29535852-49dfn" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.486043 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535852-49dfn" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.921490 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535852-49dfn"] Feb 27 00:12:01 crc kubenswrapper[4781]: I0227 00:12:01.317740 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" path="/var/lib/kubelet/pods/ac30245d-7e42-440c-99a0-60e2ae15cb8b/volumes" Feb 27 00:12:01 crc kubenswrapper[4781]: I0227 00:12:01.581084 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535852-49dfn" event={"ID":"96ecbd6e-c579-40ca-a5bf-9876777721f9","Type":"ContainerStarted","Data":"8d9ee590d475e3013eb4d1b9376a1cb32e7ad41ac823cc3a34078edf0741d292"} Feb 27 00:12:02 crc kubenswrapper[4781]: I0227 00:12:02.587433 4781 generic.go:334] "Generic (PLEG): container finished" podID="96ecbd6e-c579-40ca-a5bf-9876777721f9" containerID="5a1ffc2079241a21de7cc919695abf3baba7e2af15f91ad7d2c4786574ddb8a4" exitCode=0 Feb 27 00:12:02 crc kubenswrapper[4781]: 
I0227 00:12:02.587811 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535852-49dfn" event={"ID":"96ecbd6e-c579-40ca-a5bf-9876777721f9","Type":"ContainerDied","Data":"5a1ffc2079241a21de7cc919695abf3baba7e2af15f91ad7d2c4786574ddb8a4"} Feb 27 00:12:03 crc kubenswrapper[4781]: I0227 00:12:03.994463 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535852-49dfn" Feb 27 00:12:04 crc kubenswrapper[4781]: I0227 00:12:04.097212 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gq6k\" (UniqueName: \"kubernetes.io/projected/96ecbd6e-c579-40ca-a5bf-9876777721f9-kube-api-access-8gq6k\") pod \"96ecbd6e-c579-40ca-a5bf-9876777721f9\" (UID: \"96ecbd6e-c579-40ca-a5bf-9876777721f9\") " Feb 27 00:12:04 crc kubenswrapper[4781]: I0227 00:12:04.118067 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ecbd6e-c579-40ca-a5bf-9876777721f9-kube-api-access-8gq6k" (OuterVolumeSpecName: "kube-api-access-8gq6k") pod "96ecbd6e-c579-40ca-a5bf-9876777721f9" (UID: "96ecbd6e-c579-40ca-a5bf-9876777721f9"). InnerVolumeSpecName "kube-api-access-8gq6k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:12:04 crc kubenswrapper[4781]: I0227 00:12:04.199157 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gq6k\" (UniqueName: \"kubernetes.io/projected/96ecbd6e-c579-40ca-a5bf-9876777721f9-kube-api-access-8gq6k\") on node \"crc\" DevicePath \"\"" Feb 27 00:12:04 crc kubenswrapper[4781]: I0227 00:12:04.620326 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535852-49dfn" event={"ID":"96ecbd6e-c579-40ca-a5bf-9876777721f9","Type":"ContainerDied","Data":"8d9ee590d475e3013eb4d1b9376a1cb32e7ad41ac823cc3a34078edf0741d292"} Feb 27 00:12:04 crc kubenswrapper[4781]: I0227 00:12:04.621080 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d9ee590d475e3013eb4d1b9376a1cb32e7ad41ac823cc3a34078edf0741d292" Feb 27 00:12:04 crc kubenswrapper[4781]: I0227 00:12:04.621235 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535852-49dfn" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.083179 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kztqg"] Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.083970 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kztqg" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerName="registry-server" containerID="cri-o://efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a" gracePeriod=30 Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.099832 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-42hbx"] Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.100268 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-42hbx" 
podUID="19ed5401-2778-4266-8bf1-1c7244dac100" containerName="registry-server" containerID="cri-o://22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce" gracePeriod=30 Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.115957 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wgpv7"] Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.116291 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" containerID="cri-o://d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1" gracePeriod=30 Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.134074 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ngbg"] Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.134338 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9ngbg" podUID="baa593f3-06c4-461f-a893-609b07dfd282" containerName="registry-server" containerID="cri-o://282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5" gracePeriod=30 Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.147701 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hcdz5"] Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.147964 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hcdz5" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerName="registry-server" containerID="cri-o://170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3" gracePeriod=30 Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.151236 4781 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-h5lrz"] Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.151531 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ecbd6e-c579-40ca-a5bf-9876777721f9" containerName="oc" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.151549 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ecbd6e-c579-40ca-a5bf-9876777721f9" containerName="oc" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.151675 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ecbd6e-c579-40ca-a5bf-9876777721f9" containerName="oc" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.152019 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.154715 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h5lrz"] Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.184647 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/672e121e-2b7f-4454-b628-d99032669167-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.184818 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/672e121e-2b7f-4454-b628-d99032669167-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.184886 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxd2b\" (UniqueName: \"kubernetes.io/projected/672e121e-2b7f-4454-b628-d99032669167-kube-api-access-bxd2b\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.285892 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/672e121e-2b7f-4454-b628-d99032669167-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.285947 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxd2b\" (UniqueName: \"kubernetes.io/projected/672e121e-2b7f-4454-b628-d99032669167-kube-api-access-bxd2b\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.286018 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/672e121e-2b7f-4454-b628-d99032669167-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.287173 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/672e121e-2b7f-4454-b628-d99032669167-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h5lrz\" 
(UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.291182 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/672e121e-2b7f-4454-b628-d99032669167-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.321583 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxd2b\" (UniqueName: \"kubernetes.io/projected/672e121e-2b7f-4454-b628-d99032669167-kube-api-access-bxd2b\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.493783 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.500507 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.615477 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.625016 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.661114 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.679535 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.690989 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-catalog-content\") pod \"19ed5401-2778-4266-8bf1-1c7244dac100\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.691061 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-utilities\") pod \"19ed5401-2778-4266-8bf1-1c7244dac100\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.691146 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgndh\" (UniqueName: \"kubernetes.io/projected/19ed5401-2778-4266-8bf1-1c7244dac100-kube-api-access-xgndh\") pod \"19ed5401-2778-4266-8bf1-1c7244dac100\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.693727 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-utilities" (OuterVolumeSpecName: "utilities") pod "19ed5401-2778-4266-8bf1-1c7244dac100" (UID: "19ed5401-2778-4266-8bf1-1c7244dac100"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.697546 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19ed5401-2778-4266-8bf1-1c7244dac100-kube-api-access-xgndh" (OuterVolumeSpecName: "kube-api-access-xgndh") pod "19ed5401-2778-4266-8bf1-1c7244dac100" (UID: "19ed5401-2778-4266-8bf1-1c7244dac100"). InnerVolumeSpecName "kube-api-access-xgndh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.709221 4781 generic.go:334] "Generic (PLEG): container finished" podID="19ed5401-2778-4266-8bf1-1c7244dac100" containerID="22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce" exitCode=0 Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.709279 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42hbx" event={"ID":"19ed5401-2778-4266-8bf1-1c7244dac100","Type":"ContainerDied","Data":"22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce"} Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.709306 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42hbx" event={"ID":"19ed5401-2778-4266-8bf1-1c7244dac100","Type":"ContainerDied","Data":"78b3df3f6b7f7425a9c2cd10f5b420e9f36ecb616bd533d5cfdfee3767475ccc"} Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.709322 4781 scope.go:117] "RemoveContainer" containerID="22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.709426 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.715410 4781 generic.go:334] "Generic (PLEG): container finished" podID="baa593f3-06c4-461f-a893-609b07dfd282" containerID="282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5" exitCode=0 Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.715496 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ngbg" event={"ID":"baa593f3-06c4-461f-a893-609b07dfd282","Type":"ContainerDied","Data":"282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5"} Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.715529 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ngbg" event={"ID":"baa593f3-06c4-461f-a893-609b07dfd282","Type":"ContainerDied","Data":"9502c5ad99503e1096d0070d626245f0844a912a2ffc6a125931ca6764817da5"} Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.715557 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.718136 4781 generic.go:334] "Generic (PLEG): container finished" podID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerID="d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1" exitCode=0 Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.718321 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.718619 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" event={"ID":"6dc17f1d-c1f4-43b9-9291-7c32c6804d44","Type":"ContainerDied","Data":"d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1"} Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.718659 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" event={"ID":"6dc17f1d-c1f4-43b9-9291-7c32c6804d44","Type":"ContainerDied","Data":"eb45173a1f629c7ad2883098f5964e4563b43bb7bdca30eb6fc3bc6e2ce93911"} Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.724281 4781 generic.go:334] "Generic (PLEG): container finished" podID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerID="170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3" exitCode=0 Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.724346 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcdz5" event={"ID":"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0","Type":"ContainerDied","Data":"170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3"} Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.724372 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcdz5" event={"ID":"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0","Type":"ContainerDied","Data":"6d980a6fc9de180882f2ee8cc193af0d7ab5d1ba875bfb8da4f55cc14f767f69"} Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.724430 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.729738 4781 generic.go:334] "Generic (PLEG): container finished" podID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerID="efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a" exitCode=0 Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.729783 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kztqg" event={"ID":"2b050e9e-d6c8-4e27-ad3f-9681553c1539","Type":"ContainerDied","Data":"efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a"} Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.729810 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kztqg" event={"ID":"2b050e9e-d6c8-4e27-ad3f-9681553c1539","Type":"ContainerDied","Data":"f682c737bcb211243a2988ca17e566ea00c7e2d14bf78fba6f612945a62f66e6"} Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.729834 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.733475 4781 scope.go:117] "RemoveContainer" containerID="f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.761154 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19ed5401-2778-4266-8bf1-1c7244dac100" (UID: "19ed5401-2778-4266-8bf1-1c7244dac100"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.763882 4781 scope.go:117] "RemoveContainer" containerID="064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.787546 4781 scope.go:117] "RemoveContainer" containerID="22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.788181 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce\": container with ID starting with 22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce not found: ID does not exist" containerID="22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.788227 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce"} err="failed to get container status \"22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce\": rpc error: code = NotFound desc = could not find container \"22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce\": container with ID starting with 22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.788257 4781 scope.go:117] "RemoveContainer" containerID="f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.788531 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c\": container with ID starting with f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c not found: ID does not exist" containerID="f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.788554 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c"} err="failed to get container status \"f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c\": rpc error: code = NotFound desc = could not find container \"f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c\": container with ID starting with f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.788569 4781 scope.go:117] "RemoveContainer" containerID="064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.788915 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a\": container with ID starting with 064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a not found: ID does not exist" containerID="064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.788934 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a"} err="failed to get container status \"064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a\": rpc error: code = NotFound desc = could not find container \"064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a\": container with ID starting with 064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.788947 4781 scope.go:117] "RemoveContainer" containerID="282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793421 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-utilities\") pod \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793579 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-utilities\") pod \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793616 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-catalog-content\") pod \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793670 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztvqm\" (UniqueName: \"kubernetes.io/projected/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-kube-api-access-ztvqm\") pod \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793727 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-utilities\") pod \"baa593f3-06c4-461f-a893-609b07dfd282\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793782 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-catalog-content\") pod \"baa593f3-06c4-461f-a893-609b07dfd282\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793836 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-catalog-content\") pod \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793853 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw4cs\" (UniqueName: \"kubernetes.io/projected/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-kube-api-access-mw4cs\") pod \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793934 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-operator-metrics\") pod \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793974 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mqv2\" (UniqueName: \"kubernetes.io/projected/baa593f3-06c4-461f-a893-609b07dfd282-kube-api-access-9mqv2\") pod \"baa593f3-06c4-461f-a893-609b07dfd282\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793994 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-trusted-ca\") pod \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.794017 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpnxw\" (UniqueName: \"kubernetes.io/projected/2b050e9e-d6c8-4e27-ad3f-9681553c1539-kube-api-access-zpnxw\") pod \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.794179 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-utilities" (OuterVolumeSpecName: "utilities") pod "2b050e9e-d6c8-4e27-ad3f-9681553c1539" (UID: "2b050e9e-d6c8-4e27-ad3f-9681553c1539"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.794326 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.794482 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-utilities" (OuterVolumeSpecName: "utilities") pod "a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" (UID: "a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.795021 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgndh\" (UniqueName: \"kubernetes.io/projected/19ed5401-2778-4266-8bf1-1c7244dac100-kube-api-access-xgndh\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.795052 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.795061 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.795153 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6dc17f1d-c1f4-43b9-9291-7c32c6804d44" (UID: "6dc17f1d-c1f4-43b9-9291-7c32c6804d44"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.795852 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-utilities" (OuterVolumeSpecName: "utilities") pod "baa593f3-06c4-461f-a893-609b07dfd282" (UID: "baa593f3-06c4-461f-a893-609b07dfd282"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.797713 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b050e9e-d6c8-4e27-ad3f-9681553c1539-kube-api-access-zpnxw" (OuterVolumeSpecName: "kube-api-access-zpnxw") pod "2b050e9e-d6c8-4e27-ad3f-9681553c1539" (UID: "2b050e9e-d6c8-4e27-ad3f-9681553c1539"). InnerVolumeSpecName "kube-api-access-zpnxw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.799759 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baa593f3-06c4-461f-a893-609b07dfd282-kube-api-access-9mqv2" (OuterVolumeSpecName: "kube-api-access-9mqv2") pod "baa593f3-06c4-461f-a893-609b07dfd282" (UID: "baa593f3-06c4-461f-a893-609b07dfd282"). InnerVolumeSpecName "kube-api-access-9mqv2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.805409 4781 scope.go:117] "RemoveContainer" containerID="c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.805689 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-kube-api-access-mw4cs" (OuterVolumeSpecName: "kube-api-access-mw4cs") pod "6dc17f1d-c1f4-43b9-9291-7c32c6804d44" (UID: "6dc17f1d-c1f4-43b9-9291-7c32c6804d44"). InnerVolumeSpecName "kube-api-access-mw4cs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.808776 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-kube-api-access-ztvqm" (OuterVolumeSpecName: "kube-api-access-ztvqm") pod "a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" (UID: "a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0"). InnerVolumeSpecName "kube-api-access-ztvqm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.809022 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6dc17f1d-c1f4-43b9-9291-7c32c6804d44" (UID: "6dc17f1d-c1f4-43b9-9291-7c32c6804d44"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.821373 4781 scope.go:117] "RemoveContainer" containerID="eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.828066 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "baa593f3-06c4-461f-a893-609b07dfd282" (UID: "baa593f3-06c4-461f-a893-609b07dfd282"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.832431 4781 scope.go:117] "RemoveContainer" containerID="282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.833929 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5\": container with ID starting with 282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5 not found: ID does not exist" containerID="282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.834018 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5"} err="failed to get container status \"282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5\": rpc error: code = NotFound desc = could not find container \"282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5\": container with ID starting with 282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5 not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.834063 4781 scope.go:117] "RemoveContainer" containerID="c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.834610 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f\": container with ID starting with c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f not found: ID does not exist" containerID="c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.834656 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f"} err="failed to get container status \"c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f\": rpc error: code = NotFound desc = could not find container \"c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f\": container with ID starting with c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.834681 4781 scope.go:117] "RemoveContainer" containerID="eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.835099 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91\": container with ID starting with eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91 not found: ID does not exist" containerID="eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.835130 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91"} err="failed to get container status \"eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91\": rpc error: code = NotFound desc = could not find container \"eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91\": container with ID starting with eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91 not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.835151 4781 scope.go:117] "RemoveContainer" containerID="d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.849927 4781 scope.go:117] "RemoveContainer" containerID="ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.859684 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b050e9e-d6c8-4e27-ad3f-9681553c1539" (UID: "2b050e9e-d6c8-4e27-ad3f-9681553c1539"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.864313 4781 scope.go:117] "RemoveContainer" containerID="d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.865257 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1\": container with ID starting with d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1 not found: ID does not exist" containerID="d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.865297 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1"} err="failed to get container status \"d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1\": rpc error: code = NotFound desc = could not find container \"d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1\": container with ID starting with d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1 not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.865346 4781 scope.go:117] "RemoveContainer" containerID="ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.866457 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e\": container with ID starting with ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e not found: ID does not exist" containerID="ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.866576 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e"} err="failed to get container status \"ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e\": rpc error: code = NotFound desc = could not find container \"ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e\": container with ID starting with ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.866662 4781 scope.go:117] "RemoveContainer" containerID="170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.880317 4781 scope.go:117] "RemoveContainer" containerID="a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896338 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztvqm\" (UniqueName: \"kubernetes.io/projected/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-kube-api-access-ztvqm\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896383 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896399 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896412 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896427 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw4cs\" (UniqueName: \"kubernetes.io/projected/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-kube-api-access-mw4cs\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896440 4781 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896454 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mqv2\" (UniqueName: \"kubernetes.io/projected/baa593f3-06c4-461f-a893-609b07dfd282-kube-api-access-9mqv2\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896467 4781 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896479 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpnxw\" (UniqueName: \"kubernetes.io/projected/2b050e9e-d6c8-4e27-ad3f-9681553c1539-kube-api-access-zpnxw\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896490 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.898828 4781 scope.go:117] "RemoveContainer" containerID="a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.909936 4781 scope.go:117] "RemoveContainer" containerID="170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.910327 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3\": container with ID starting with 170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3 not found: ID does not exist" containerID="170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.910368 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3"} err="failed to get container status \"170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3\": rpc error: code = NotFound desc = could not find container \"170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3\": container with ID starting with 170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3 not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.910395 4781 scope.go:117] "RemoveContainer" containerID="a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.910852 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093\": container with ID starting with a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093 not found: ID does not exist" containerID="a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.910901 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093"} err="failed to get container status \"a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093\": rpc error: code = NotFound desc = could not find container \"a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093\": container with ID starting with a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093 not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.910935 4781 scope.go:117] "RemoveContainer" containerID="a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.911241 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a\": container with ID starting with a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a not found: ID does not exist" containerID="a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.911269 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a"} err="failed to get container status \"a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a\": rpc error: code = NotFound desc = could not find container \"a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a\": container with ID starting with a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.911286 4781 scope.go:117] "RemoveContainer" containerID="efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.923790 4781 scope.go:117] "RemoveContainer" containerID="254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.936741 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" (UID: "a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.949686 4781 scope.go:117] "RemoveContainer" containerID="d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.967214 4781 scope.go:117] "RemoveContainer" containerID="efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.967581 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a\": container with ID starting with efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a not found: ID does not exist" containerID="efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.967615 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a"} err="failed to get container status \"efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a\": rpc error: code = NotFound desc = could not find container \"efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a\": container with ID starting with efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.967650 4781 scope.go:117] "RemoveContainer" containerID="254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.967668 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h5lrz"]
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.967942 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8\": container with ID starting with 254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8 not found: ID does not exist" containerID="254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.967968 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8"} err="failed to get container status \"254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8\": rpc error: code = NotFound desc = could not find container \"254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8\": container with ID starting with 254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8 not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.967984 4781 scope.go:117] "RemoveContainer" containerID="d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.968269 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd\": container with ID starting with d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd not found: ID does not exist" containerID="d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.968291 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd"} err="failed to get container status \"d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd\": rpc error: code = NotFound desc = could not find container \"d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd\": container with ID starting with d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.997473 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.042265 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-42hbx"]
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.047141 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-42hbx"]
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.067599 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ngbg"]
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.072802 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ngbg"]
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.081650 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hcdz5"]
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.086688 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hcdz5"]
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.102196 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kztqg"]
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.106052 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kztqg"]
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.108807 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wgpv7"]
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.112053 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wgpv7"]
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.315720 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" path="/var/lib/kubelet/pods/19ed5401-2778-4266-8bf1-1c7244dac100/volumes"
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.316619 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" path="/var/lib/kubelet/pods/2b050e9e-d6c8-4e27-ad3f-9681553c1539/volumes"
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.317370 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" path="/var/lib/kubelet/pods/6dc17f1d-c1f4-43b9-9291-7c32c6804d44/volumes"
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.318513 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" path="/var/lib/kubelet/pods/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0/volumes"
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.319210 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baa593f3-06c4-461f-a893-609b07dfd282" path="/var/lib/kubelet/pods/baa593f3-06c4-461f-a893-609b07dfd282/volumes"
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.737560 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" event={"ID":"672e121e-2b7f-4454-b628-d99032669167","Type":"ContainerStarted","Data":"0ae642073089998589b3411622ec45694b1a13d8c7760ac9204b57878538de54"}
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.737614 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" event={"ID":"672e121e-2b7f-4454-b628-d99032669167","Type":"ContainerStarted","Data":"6aba3392a6bb9f45d6db84c5245963a2d13b5bb8600834bc595b81ced0fcb847"}
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.737851 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz"
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.743587 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz"
Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.757541 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" podStartSLOduration=1.757516319 podStartE2EDuration="1.757516319s" podCreationTimestamp="2026-02-27 00:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:12:17.756914723 +0000 UTC m=+407.014454277" watchObservedRunningTime="2026-02-27 00:12:17.757516319 +0000 UTC m=+407.015055903"
Feb 27 00:12:42 crc kubenswrapper[4781]: I0227 00:12:42.894982 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 00:12:42 crc kubenswrapper[4781]: I0227 00:12:42.895712 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.096611 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-kdd5s"]
Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097032 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa593f3-06c4-461f-a893-609b07dfd282" containerName="extract-utilities"
Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097044 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa593f3-06c4-461f-a893-609b07dfd282" containerName="extract-utilities"
Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097054 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerName="extract-content"
Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097059 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerName="extract-content"
Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097067 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa593f3-06c4-461f-a893-609b07dfd282" containerName="extract-content"
Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097073 4781 state_mem.go:107]
"Deleted CPUSet assignment" podUID="baa593f3-06c4-461f-a893-609b07dfd282" containerName="extract-content" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097086 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" containerName="extract-utilities" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097092 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" containerName="extract-utilities" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097099 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa593f3-06c4-461f-a893-609b07dfd282" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097104 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa593f3-06c4-461f-a893-609b07dfd282" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097113 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerName="extract-utilities" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097119 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerName="extract-utilities" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097126 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerName="extract-utilities" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097132 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerName="extract-utilities" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097142 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097148 4781 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097154 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerName="extract-content" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097159 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerName="extract-content" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097168 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097173 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097185 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097190 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097198 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" containerName="extract-content" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097203 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" containerName="extract-content" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097212 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097217 4781 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097291 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097301 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="baa593f3-06c4-461f-a893-609b07dfd282" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097312 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097319 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097327 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097334 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097694 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.109087 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-kdd5s"] Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.211888 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhsgl\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-kube-api-access-xhsgl\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.211931 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9214145c-17df-4f6a-9d5d-fa488256bf24-installation-pull-secrets\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.211955 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.212001 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9214145c-17df-4f6a-9d5d-fa488256bf24-registry-certificates\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.212037 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-bound-sa-token\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.212056 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9214145c-17df-4f6a-9d5d-fa488256bf24-ca-trust-extracted\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.212081 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9214145c-17df-4f6a-9d5d-fa488256bf24-trusted-ca\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.212105 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-registry-tls\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.231029 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.313567 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-bound-sa-token\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.313649 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9214145c-17df-4f6a-9d5d-fa488256bf24-ca-trust-extracted\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.313689 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9214145c-17df-4f6a-9d5d-fa488256bf24-trusted-ca\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.313713 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-registry-tls\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.313959 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xhsgl\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-kube-api-access-xhsgl\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.314404 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9214145c-17df-4f6a-9d5d-fa488256bf24-ca-trust-extracted\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.314735 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9214145c-17df-4f6a-9d5d-fa488256bf24-installation-pull-secrets\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.314879 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9214145c-17df-4f6a-9d5d-fa488256bf24-registry-certificates\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.315022 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9214145c-17df-4f6a-9d5d-fa488256bf24-trusted-ca\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 
00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.316171 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9214145c-17df-4f6a-9d5d-fa488256bf24-registry-certificates\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.319734 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9214145c-17df-4f6a-9d5d-fa488256bf24-installation-pull-secrets\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.327873 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-registry-tls\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.332484 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-bound-sa-token\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.337051 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhsgl\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-kube-api-access-xhsgl\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.413984 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.646260 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-kdd5s"] Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.927731 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" event={"ID":"9214145c-17df-4f6a-9d5d-fa488256bf24","Type":"ContainerStarted","Data":"b106ff048bebda9206279c84d9b98418e2ff640bd6d31039c45d27053e3ab869"} Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.927783 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" event={"ID":"9214145c-17df-4f6a-9d5d-fa488256bf24","Type":"ContainerStarted","Data":"10a05872b968a5c227876b0f31b8cab287f5dfa365e8e2186ce3887a6f9f2774"} Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.927945 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.951801 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" podStartSLOduration=0.951783284 podStartE2EDuration="951.783284ms" podCreationTimestamp="2026-02-27 00:12:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:12:46.949060493 +0000 UTC m=+436.206600037" watchObservedRunningTime="2026-02-27 00:12:46.951783284 +0000 UTC m=+436.209322848" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.114399 4781 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-sp6x7"] Feb 27 00:12:47 crc kubenswrapper[4781]: E0227 00:12:47.115035 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.115051 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.118531 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.120514 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.122877 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sp6x7"] Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.228061 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc9df096-6538-4b50-8536-bfdd5474eece-utilities\") pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.228377 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2t2x\" (UniqueName: \"kubernetes.io/projected/dc9df096-6538-4b50-8536-bfdd5474eece-kube-api-access-r2t2x\") pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.228548 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc9df096-6538-4b50-8536-bfdd5474eece-catalog-content\") pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.314871 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9sn"] Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.315989 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.318321 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.327220 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9sn"] Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.329490 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc9df096-6538-4b50-8536-bfdd5474eece-catalog-content\") pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.330051 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc9df096-6538-4b50-8536-bfdd5474eece-catalog-content\") pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.330743 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc9df096-6538-4b50-8536-bfdd5474eece-utilities\") 
pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.330784 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2t2x\" (UniqueName: \"kubernetes.io/projected/dc9df096-6538-4b50-8536-bfdd5474eece-kube-api-access-r2t2x\") pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.331033 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc9df096-6538-4b50-8536-bfdd5474eece-utilities\") pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.354503 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2t2x\" (UniqueName: \"kubernetes.io/projected/dc9df096-6538-4b50-8536-bfdd5474eece-kube-api-access-r2t2x\") pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.432038 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0f47-560e-4d1a-8414-b65b1a159c68-utilities\") pod \"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.432146 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0f47-560e-4d1a-8414-b65b1a159c68-catalog-content\") pod 
\"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.432242 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rvw2\" (UniqueName: \"kubernetes.io/projected/1b6e0f47-560e-4d1a-8414-b65b1a159c68-kube-api-access-8rvw2\") pod \"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.436921 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.534452 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rvw2\" (UniqueName: \"kubernetes.io/projected/1b6e0f47-560e-4d1a-8414-b65b1a159c68-kube-api-access-8rvw2\") pod \"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.534805 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0f47-560e-4d1a-8414-b65b1a159c68-utilities\") pod \"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.534846 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0f47-560e-4d1a-8414-b65b1a159c68-catalog-content\") pod \"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.535549 
4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0f47-560e-4d1a-8414-b65b1a159c68-utilities\") pod \"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.536466 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0f47-560e-4d1a-8414-b65b1a159c68-catalog-content\") pod \"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.557685 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rvw2\" (UniqueName: \"kubernetes.io/projected/1b6e0f47-560e-4d1a-8414-b65b1a159c68-kube-api-access-8rvw2\") pod \"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.649364 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.840473 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sp6x7"] Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.855177 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9sn"] Feb 27 00:12:47 crc kubenswrapper[4781]: W0227 00:12:47.860387 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b6e0f47_560e_4d1a_8414_b65b1a159c68.slice/crio-be41cdab5afb116c2dd348da76c4c90ffdc4f2598752927b7cd53ef460842451 WatchSource:0}: Error finding container be41cdab5afb116c2dd348da76c4c90ffdc4f2598752927b7cd53ef460842451: Status 404 returned error can't find the container with id be41cdab5afb116c2dd348da76c4c90ffdc4f2598752927b7cd53ef460842451 Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.934696 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp6x7" event={"ID":"dc9df096-6538-4b50-8536-bfdd5474eece","Type":"ContainerStarted","Data":"0e534f4895fd12f31a5892b669b3034b19ea81c3dce0ed8633a17ea5fbab9974"} Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.935729 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9sn" event={"ID":"1b6e0f47-560e-4d1a-8414-b65b1a159c68","Type":"ContainerStarted","Data":"be41cdab5afb116c2dd348da76c4c90ffdc4f2598752927b7cd53ef460842451"} Feb 27 00:12:48 crc kubenswrapper[4781]: I0227 00:12:48.941917 4781 generic.go:334] "Generic (PLEG): container finished" podID="1b6e0f47-560e-4d1a-8414-b65b1a159c68" containerID="ec58bc6bc7b876505ea1296e1abc01cb884fc82e5f593fe07e8246a1dd8a35cf" exitCode=0 Feb 27 00:12:48 crc kubenswrapper[4781]: I0227 00:12:48.942087 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-sw9sn" event={"ID":"1b6e0f47-560e-4d1a-8414-b65b1a159c68","Type":"ContainerDied","Data":"ec58bc6bc7b876505ea1296e1abc01cb884fc82e5f593fe07e8246a1dd8a35cf"} Feb 27 00:12:48 crc kubenswrapper[4781]: I0227 00:12:48.944382 4781 generic.go:334] "Generic (PLEG): container finished" podID="dc9df096-6538-4b50-8536-bfdd5474eece" containerID="cf9515ff8da586abfc861e302536bce5f9ffd87a480960cd1dba5923515e72f5" exitCode=0 Feb 27 00:12:48 crc kubenswrapper[4781]: I0227 00:12:48.944409 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp6x7" event={"ID":"dc9df096-6538-4b50-8536-bfdd5474eece","Type":"ContainerDied","Data":"cf9515ff8da586abfc861e302536bce5f9ffd87a480960cd1dba5923515e72f5"} Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.512459 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pjpww"] Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.515483 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.517841 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.526341 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pjpww"] Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.575369 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47g7t\" (UniqueName: \"kubernetes.io/projected/5ef2a1c8-c174-456d-adff-2693b022fa83-kube-api-access-47g7t\") pod \"community-operators-pjpww\" (UID: \"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.575451 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ef2a1c8-c174-456d-adff-2693b022fa83-utilities\") pod \"community-operators-pjpww\" (UID: \"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.575490 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ef2a1c8-c174-456d-adff-2693b022fa83-catalog-content\") pod \"community-operators-pjpww\" (UID: \"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.677044 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ef2a1c8-c174-456d-adff-2693b022fa83-catalog-content\") pod \"community-operators-pjpww\" (UID: 
\"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.677151 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47g7t\" (UniqueName: \"kubernetes.io/projected/5ef2a1c8-c174-456d-adff-2693b022fa83-kube-api-access-47g7t\") pod \"community-operators-pjpww\" (UID: \"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.677206 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ef2a1c8-c174-456d-adff-2693b022fa83-utilities\") pod \"community-operators-pjpww\" (UID: \"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.677965 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ef2a1c8-c174-456d-adff-2693b022fa83-catalog-content\") pod \"community-operators-pjpww\" (UID: \"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.678067 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ef2a1c8-c174-456d-adff-2693b022fa83-utilities\") pod \"community-operators-pjpww\" (UID: \"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.696524 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47g7t\" (UniqueName: \"kubernetes.io/projected/5ef2a1c8-c174-456d-adff-2693b022fa83-kube-api-access-47g7t\") pod \"community-operators-pjpww\" (UID: 
\"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.709969 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kpswm"] Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.711309 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.713529 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.720857 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kpswm"] Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.778042 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-catalog-content\") pod \"certified-operators-kpswm\" (UID: \"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.778083 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-utilities\") pod \"certified-operators-kpswm\" (UID: \"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.778112 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwc74\" (UniqueName: \"kubernetes.io/projected/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-kube-api-access-wwc74\") pod \"certified-operators-kpswm\" (UID: 
\"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.838501 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.879287 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-catalog-content\") pod \"certified-operators-kpswm\" (UID: \"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.879347 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-utilities\") pod \"certified-operators-kpswm\" (UID: \"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.879394 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwc74\" (UniqueName: \"kubernetes.io/projected/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-kube-api-access-wwc74\") pod \"certified-operators-kpswm\" (UID: \"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.880009 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-utilities\") pod \"certified-operators-kpswm\" (UID: \"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.882453 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-catalog-content\") pod \"certified-operators-kpswm\" (UID: \"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.899445 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwc74\" (UniqueName: \"kubernetes.io/projected/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-kube-api-access-wwc74\") pod \"certified-operators-kpswm\" (UID: \"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.955646 4781 generic.go:334] "Generic (PLEG): container finished" podID="1b6e0f47-560e-4d1a-8414-b65b1a159c68" containerID="bc56036947f389c30da6f4d6cf65a2a2aa23b50e0d86c6420c1f6da94739bf2c" exitCode=0 Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.955938 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9sn" event={"ID":"1b6e0f47-560e-4d1a-8414-b65b1a159c68","Type":"ContainerDied","Data":"bc56036947f389c30da6f4d6cf65a2a2aa23b50e0d86c6420c1f6da94739bf2c"} Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.959465 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp6x7" event={"ID":"dc9df096-6538-4b50-8536-bfdd5474eece","Type":"ContainerStarted","Data":"c7fead24172fe5cc2930ab204e9faa584280644447a9f1c3406e8bebb16e2a9c"} Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.066443 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pjpww"] Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.068785 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:50 crc kubenswrapper[4781]: W0227 00:12:50.078006 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ef2a1c8_c174_456d_adff_2693b022fa83.slice/crio-1339fbb9ae8ae1924a262446b32a911cfb2db8083af9a1a8e3ffcecc9410f70e WatchSource:0}: Error finding container 1339fbb9ae8ae1924a262446b32a911cfb2db8083af9a1a8e3ffcecc9410f70e: Status 404 returned error can't find the container with id 1339fbb9ae8ae1924a262446b32a911cfb2db8083af9a1a8e3ffcecc9410f70e Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.882499 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kpswm"] Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.966562 4781 generic.go:334] "Generic (PLEG): container finished" podID="dc9df096-6538-4b50-8536-bfdd5474eece" containerID="c7fead24172fe5cc2930ab204e9faa584280644447a9f1c3406e8bebb16e2a9c" exitCode=0 Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.966778 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp6x7" event={"ID":"dc9df096-6538-4b50-8536-bfdd5474eece","Type":"ContainerDied","Data":"c7fead24172fe5cc2930ab204e9faa584280644447a9f1c3406e8bebb16e2a9c"} Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.977402 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9sn" event={"ID":"1b6e0f47-560e-4d1a-8414-b65b1a159c68","Type":"ContainerStarted","Data":"56a11ee178e49c4a68132eea8bbfe49daa76c637a8b8ff7ab08d3049ae4223cf"} Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.978969 4781 generic.go:334] "Generic (PLEG): container finished" podID="5ef2a1c8-c174-456d-adff-2693b022fa83" containerID="a30737fec7fa0dc292ac37010102da44f94592d0de29f83e0d51c4626c4936b5" exitCode=0 Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 
00:12:50.979173 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pjpww" event={"ID":"5ef2a1c8-c174-456d-adff-2693b022fa83","Type":"ContainerDied","Data":"a30737fec7fa0dc292ac37010102da44f94592d0de29f83e0d51c4626c4936b5"} Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.979481 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pjpww" event={"ID":"5ef2a1c8-c174-456d-adff-2693b022fa83","Type":"ContainerStarted","Data":"1339fbb9ae8ae1924a262446b32a911cfb2db8083af9a1a8e3ffcecc9410f70e"} Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.980668 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpswm" event={"ID":"9186313b-02fa-4d6f-9394-ab05a9e3d7d4","Type":"ContainerStarted","Data":"8f604256dd01c9824287221d4e6e219f5bc3aa3b917573f0f5924b28f43596d5"} Feb 27 00:12:51 crc kubenswrapper[4781]: I0227 00:12:51.000838 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sw9sn" podStartSLOduration=2.439722538 podStartE2EDuration="4.000821031s" podCreationTimestamp="2026-02-27 00:12:47 +0000 UTC" firstStartedPulling="2026-02-27 00:12:48.94387004 +0000 UTC m=+438.201409594" lastFinishedPulling="2026-02-27 00:12:50.504968533 +0000 UTC m=+439.762508087" observedRunningTime="2026-02-27 00:12:50.999812075 +0000 UTC m=+440.257351639" watchObservedRunningTime="2026-02-27 00:12:51.000821031 +0000 UTC m=+440.258360595" Feb 27 00:12:51 crc kubenswrapper[4781]: I0227 00:12:51.987445 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp6x7" event={"ID":"dc9df096-6538-4b50-8536-bfdd5474eece","Type":"ContainerStarted","Data":"fd28bbc68e1ee130d63eaad06113a56225923cae8547a7c0754198026c6d7375"} Feb 27 00:12:51 crc kubenswrapper[4781]: I0227 00:12:51.988933 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-pjpww" event={"ID":"5ef2a1c8-c174-456d-adff-2693b022fa83","Type":"ContainerStarted","Data":"a3624f1dbc87dc2a10a15bacc09efc78fd1d0a601f477d1fadb684f3c3187f26"} Feb 27 00:12:51 crc kubenswrapper[4781]: I0227 00:12:51.990554 4781 generic.go:334] "Generic (PLEG): container finished" podID="9186313b-02fa-4d6f-9394-ab05a9e3d7d4" containerID="b7804c15b27555ed46326a0e5f53b71aa843ba5e64ce55c39d75f8df65107fd4" exitCode=0 Feb 27 00:12:51 crc kubenswrapper[4781]: I0227 00:12:51.990659 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpswm" event={"ID":"9186313b-02fa-4d6f-9394-ab05a9e3d7d4","Type":"ContainerDied","Data":"b7804c15b27555ed46326a0e5f53b71aa843ba5e64ce55c39d75f8df65107fd4"} Feb 27 00:12:52 crc kubenswrapper[4781]: I0227 00:12:52.008018 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sp6x7" podStartSLOduration=2.533475212 podStartE2EDuration="5.008003268s" podCreationTimestamp="2026-02-27 00:12:47 +0000 UTC" firstStartedPulling="2026-02-27 00:12:48.945491252 +0000 UTC m=+438.203030806" lastFinishedPulling="2026-02-27 00:12:51.420019318 +0000 UTC m=+440.677558862" observedRunningTime="2026-02-27 00:12:52.00464226 +0000 UTC m=+441.262181824" watchObservedRunningTime="2026-02-27 00:12:52.008003268 +0000 UTC m=+441.265542812" Feb 27 00:12:52 crc kubenswrapper[4781]: I0227 00:12:52.997306 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpswm" event={"ID":"9186313b-02fa-4d6f-9394-ab05a9e3d7d4","Type":"ContainerStarted","Data":"90585b6b5a7eeaf711facd15133df6ac4b7f831dd2a88a4db8bb5ac6f34036f0"} Feb 27 00:12:52 crc kubenswrapper[4781]: I0227 00:12:52.999416 4781 generic.go:334] "Generic (PLEG): container finished" podID="5ef2a1c8-c174-456d-adff-2693b022fa83" containerID="a3624f1dbc87dc2a10a15bacc09efc78fd1d0a601f477d1fadb684f3c3187f26" exitCode=0 Feb 27 
00:12:52 crc kubenswrapper[4781]: I0227 00:12:52.999489 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pjpww" event={"ID":"5ef2a1c8-c174-456d-adff-2693b022fa83","Type":"ContainerDied","Data":"a3624f1dbc87dc2a10a15bacc09efc78fd1d0a601f477d1fadb684f3c3187f26"} Feb 27 00:12:54 crc kubenswrapper[4781]: I0227 00:12:54.005491 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pjpww" event={"ID":"5ef2a1c8-c174-456d-adff-2693b022fa83","Type":"ContainerStarted","Data":"b14ff0d374bd72b5afdaa3e58794be865fb0037c254f022dfd680b12fd3232b3"} Feb 27 00:12:54 crc kubenswrapper[4781]: I0227 00:12:54.009816 4781 generic.go:334] "Generic (PLEG): container finished" podID="9186313b-02fa-4d6f-9394-ab05a9e3d7d4" containerID="90585b6b5a7eeaf711facd15133df6ac4b7f831dd2a88a4db8bb5ac6f34036f0" exitCode=0 Feb 27 00:12:54 crc kubenswrapper[4781]: I0227 00:12:54.009851 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpswm" event={"ID":"9186313b-02fa-4d6f-9394-ab05a9e3d7d4","Type":"ContainerDied","Data":"90585b6b5a7eeaf711facd15133df6ac4b7f831dd2a88a4db8bb5ac6f34036f0"} Feb 27 00:12:54 crc kubenswrapper[4781]: I0227 00:12:54.027365 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pjpww" podStartSLOduration=2.23990224 podStartE2EDuration="5.027347892s" podCreationTimestamp="2026-02-27 00:12:49 +0000 UTC" firstStartedPulling="2026-02-27 00:12:50.97999696 +0000 UTC m=+440.237536514" lastFinishedPulling="2026-02-27 00:12:53.767442622 +0000 UTC m=+443.024982166" observedRunningTime="2026-02-27 00:12:54.024985971 +0000 UTC m=+443.282525525" watchObservedRunningTime="2026-02-27 00:12:54.027347892 +0000 UTC m=+443.284887456" Feb 27 00:12:55 crc kubenswrapper[4781]: I0227 00:12:55.017991 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-kpswm" event={"ID":"9186313b-02fa-4d6f-9394-ab05a9e3d7d4","Type":"ContainerStarted","Data":"2d99ae75d86c387a8e30e3e603db79600d116cb08d25683fcb2c54c22a28f8a6"} Feb 27 00:12:55 crc kubenswrapper[4781]: I0227 00:12:55.043051 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kpswm" podStartSLOduration=3.466674029 podStartE2EDuration="6.04303063s" podCreationTimestamp="2026-02-27 00:12:49 +0000 UTC" firstStartedPulling="2026-02-27 00:12:51.991574021 +0000 UTC m=+441.249113575" lastFinishedPulling="2026-02-27 00:12:54.567930612 +0000 UTC m=+443.825470176" observedRunningTime="2026-02-27 00:12:55.037365713 +0000 UTC m=+444.294905277" watchObservedRunningTime="2026-02-27 00:12:55.04303063 +0000 UTC m=+444.300570214" Feb 27 00:12:57 crc kubenswrapper[4781]: I0227 00:12:57.437608 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:57 crc kubenswrapper[4781]: I0227 00:12:57.438869 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:57 crc kubenswrapper[4781]: I0227 00:12:57.504474 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:57 crc kubenswrapper[4781]: I0227 00:12:57.650178 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:57 crc kubenswrapper[4781]: I0227 00:12:57.650737 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:57 crc kubenswrapper[4781]: I0227 00:12:57.693556 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:58 crc 
kubenswrapper[4781]: I0227 00:12:58.076509 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:58 crc kubenswrapper[4781]: I0227 00:12:58.102702 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:59 crc kubenswrapper[4781]: I0227 00:12:59.839418 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:59 crc kubenswrapper[4781]: I0227 00:12:59.839671 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:59 crc kubenswrapper[4781]: I0227 00:12:59.902773 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:13:00 crc kubenswrapper[4781]: I0227 00:13:00.069532 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:13:00 crc kubenswrapper[4781]: I0227 00:13:00.069581 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:13:00 crc kubenswrapper[4781]: I0227 00:13:00.092609 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:13:00 crc kubenswrapper[4781]: I0227 00:13:00.123702 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:13:01 crc kubenswrapper[4781]: I0227 00:13:01.090870 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:13:06 crc kubenswrapper[4781]: I0227 00:13:06.425600 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:13:06 crc kubenswrapper[4781]: I0227 00:13:06.517757 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tw95c"] Feb 27 00:13:12 crc kubenswrapper[4781]: I0227 00:13:12.895052 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:13:12 crc kubenswrapper[4781]: I0227 00:13:12.895827 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:13:31 crc kubenswrapper[4781]: I0227 00:13:31.569477 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" podUID="16339491-baee-42b5-82bb-07bca82a5f77" containerName="registry" containerID="cri-o://c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077" gracePeriod=30 Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.034048 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.159878 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/16339491-baee-42b5-82bb-07bca82a5f77-installation-pull-secrets\") pod \"16339491-baee-42b5-82bb-07bca82a5f77\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.160019 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-registry-tls\") pod \"16339491-baee-42b5-82bb-07bca82a5f77\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.160058 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-trusted-ca\") pod \"16339491-baee-42b5-82bb-07bca82a5f77\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.160091 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/16339491-baee-42b5-82bb-07bca82a5f77-ca-trust-extracted\") pod \"16339491-baee-42b5-82bb-07bca82a5f77\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.160125 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-bound-sa-token\") pod \"16339491-baee-42b5-82bb-07bca82a5f77\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.160144 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-fwd7v\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-kube-api-access-fwd7v\") pod \"16339491-baee-42b5-82bb-07bca82a5f77\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.160289 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"16339491-baee-42b5-82bb-07bca82a5f77\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.160324 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-registry-certificates\") pod \"16339491-baee-42b5-82bb-07bca82a5f77\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.162089 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "16339491-baee-42b5-82bb-07bca82a5f77" (UID: "16339491-baee-42b5-82bb-07bca82a5f77"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.164264 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "16339491-baee-42b5-82bb-07bca82a5f77" (UID: "16339491-baee-42b5-82bb-07bca82a5f77"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.169853 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16339491-baee-42b5-82bb-07bca82a5f77-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "16339491-baee-42b5-82bb-07bca82a5f77" (UID: "16339491-baee-42b5-82bb-07bca82a5f77"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.170147 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "16339491-baee-42b5-82bb-07bca82a5f77" (UID: "16339491-baee-42b5-82bb-07bca82a5f77"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.170249 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-kube-api-access-fwd7v" (OuterVolumeSpecName: "kube-api-access-fwd7v") pod "16339491-baee-42b5-82bb-07bca82a5f77" (UID: "16339491-baee-42b5-82bb-07bca82a5f77"). InnerVolumeSpecName "kube-api-access-fwd7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.170673 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "16339491-baee-42b5-82bb-07bca82a5f77" (UID: "16339491-baee-42b5-82bb-07bca82a5f77"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.170733 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "16339491-baee-42b5-82bb-07bca82a5f77" (UID: "16339491-baee-42b5-82bb-07bca82a5f77"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.181338 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16339491-baee-42b5-82bb-07bca82a5f77-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "16339491-baee-42b5-82bb-07bca82a5f77" (UID: "16339491-baee-42b5-82bb-07bca82a5f77"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.235201 4781 generic.go:334] "Generic (PLEG): container finished" podID="16339491-baee-42b5-82bb-07bca82a5f77" containerID="c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077" exitCode=0
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.235248 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" event={"ID":"16339491-baee-42b5-82bb-07bca82a5f77","Type":"ContainerDied","Data":"c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077"}
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.235276 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" event={"ID":"16339491-baee-42b5-82bb-07bca82a5f77","Type":"ContainerDied","Data":"baa2ed7e45a407c61fcadf3b6fb1abb2bf58b2f1863ead5f5bd18f0e92393602"}
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.235275 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.235304 4781 scope.go:117] "RemoveContainer" containerID="c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077"
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.261843 4781 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.261891 4781 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/16339491-baee-42b5-82bb-07bca82a5f77-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.261908 4781 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.261922 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.261941 4781 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/16339491-baee-42b5-82bb-07bca82a5f77-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.261956 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwd7v\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-kube-api-access-fwd7v\") on node \"crc\" DevicePath \"\""
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.261972 4781 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.270095 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tw95c"]
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.272164 4781 scope.go:117] "RemoveContainer" containerID="c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077"
Feb 27 00:13:32 crc kubenswrapper[4781]: E0227 00:13:32.272806 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077\": container with ID starting with c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077 not found: ID does not exist" containerID="c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077"
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.272833 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077"} err="failed to get container status \"c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077\": rpc error: code = NotFound desc = could not find container \"c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077\": container with ID starting with c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077 not found: ID does not exist"
Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.275424 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tw95c"]
Feb 27 00:13:33 crc kubenswrapper[4781]: I0227 00:13:33.317343 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16339491-baee-42b5-82bb-07bca82a5f77" path="/var/lib/kubelet/pods/16339491-baee-42b5-82bb-07bca82a5f77/volumes"
Feb 27 00:13:42 crc kubenswrapper[4781]: I0227 00:13:42.895872 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 00:13:42 crc kubenswrapper[4781]: I0227 00:13:42.896534 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 00:13:42 crc kubenswrapper[4781]: I0227 00:13:42.896596 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj"
Feb 27 00:13:42 crc kubenswrapper[4781]: I0227 00:13:42.897285 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0a9584e9887d3110a6a6d2ad5c5024fb38c734637c177fd2cbddb2eae4932cdc"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 27 00:13:42 crc kubenswrapper[4781]: I0227 00:13:42.897357 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://0a9584e9887d3110a6a6d2ad5c5024fb38c734637c177fd2cbddb2eae4932cdc" gracePeriod=600
Feb 27 00:13:43 crc kubenswrapper[4781]: I0227 00:13:43.311017 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="0a9584e9887d3110a6a6d2ad5c5024fb38c734637c177fd2cbddb2eae4932cdc" exitCode=0
Feb 27 00:13:43 crc kubenswrapper[4781]: I0227 00:13:43.314796 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"0a9584e9887d3110a6a6d2ad5c5024fb38c734637c177fd2cbddb2eae4932cdc"}
Feb 27 00:13:43 crc kubenswrapper[4781]: I0227 00:13:43.314838 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"98d9908780c17a21a4a701f7c994bde3e3fbb6ea911f1b4e11c3a27ce7db4d1d"}
Feb 27 00:13:43 crc kubenswrapper[4781]: I0227 00:13:43.314855 4781 scope.go:117] "RemoveContainer" containerID="f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089"
Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.133198 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535854-lplm8"]
Feb 27 00:14:00 crc kubenswrapper[4781]: E0227 00:14:00.134001 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16339491-baee-42b5-82bb-07bca82a5f77" containerName="registry"
Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.134017 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="16339491-baee-42b5-82bb-07bca82a5f77" containerName="registry"
Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.134146 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="16339491-baee-42b5-82bb-07bca82a5f77" containerName="registry"
Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.134970 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535854-lplm8"
Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.136770 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.136806 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.136916 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr"
Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.138143 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535854-lplm8"]
Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.327724 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tthd\" (UniqueName: \"kubernetes.io/projected/d2676f22-56e0-46ed-83d0-4d29fc704155-kube-api-access-5tthd\") pod \"auto-csr-approver-29535854-lplm8\" (UID: \"d2676f22-56e0-46ed-83d0-4d29fc704155\") " pod="openshift-infra/auto-csr-approver-29535854-lplm8"
Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.428288 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tthd\" (UniqueName: \"kubernetes.io/projected/d2676f22-56e0-46ed-83d0-4d29fc704155-kube-api-access-5tthd\") pod \"auto-csr-approver-29535854-lplm8\" (UID: \"d2676f22-56e0-46ed-83d0-4d29fc704155\") " pod="openshift-infra/auto-csr-approver-29535854-lplm8"
Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.451930 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tthd\" (UniqueName: \"kubernetes.io/projected/d2676f22-56e0-46ed-83d0-4d29fc704155-kube-api-access-5tthd\") pod \"auto-csr-approver-29535854-lplm8\" (UID: \"d2676f22-56e0-46ed-83d0-4d29fc704155\") " pod="openshift-infra/auto-csr-approver-29535854-lplm8"
Feb 27 00:14:01 crc kubenswrapper[4781]: I0227 00:14:01.066124 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535854-lplm8"
Feb 27 00:14:01 crc kubenswrapper[4781]: I0227 00:14:01.263866 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535854-lplm8"]
Feb 27 00:14:01 crc kubenswrapper[4781]: I0227 00:14:01.428885 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535854-lplm8" event={"ID":"d2676f22-56e0-46ed-83d0-4d29fc704155","Type":"ContainerStarted","Data":"ede374a6e9d1e538be7d431f4a287918634521cca633711915570e82f4a64bea"}
Feb 27 00:14:03 crc kubenswrapper[4781]: I0227 00:14:03.440773 4781 generic.go:334] "Generic (PLEG): container finished" podID="d2676f22-56e0-46ed-83d0-4d29fc704155" containerID="7ea50ff483bc5e473c8ac4484b625c2d3aca274594f654dad11472e0c517581a" exitCode=0
Feb 27 00:14:03 crc kubenswrapper[4781]: I0227 00:14:03.440835 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535854-lplm8" event={"ID":"d2676f22-56e0-46ed-83d0-4d29fc704155","Type":"ContainerDied","Data":"7ea50ff483bc5e473c8ac4484b625c2d3aca274594f654dad11472e0c517581a"}
Feb 27 00:14:04 crc kubenswrapper[4781]: I0227 00:14:04.666881 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535854-lplm8"
Feb 27 00:14:04 crc kubenswrapper[4781]: I0227 00:14:04.778425 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tthd\" (UniqueName: \"kubernetes.io/projected/d2676f22-56e0-46ed-83d0-4d29fc704155-kube-api-access-5tthd\") pod \"d2676f22-56e0-46ed-83d0-4d29fc704155\" (UID: \"d2676f22-56e0-46ed-83d0-4d29fc704155\") "
Feb 27 00:14:04 crc kubenswrapper[4781]: I0227 00:14:04.785917 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2676f22-56e0-46ed-83d0-4d29fc704155-kube-api-access-5tthd" (OuterVolumeSpecName: "kube-api-access-5tthd") pod "d2676f22-56e0-46ed-83d0-4d29fc704155" (UID: "d2676f22-56e0-46ed-83d0-4d29fc704155"). InnerVolumeSpecName "kube-api-access-5tthd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:14:04 crc kubenswrapper[4781]: I0227 00:14:04.880130 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tthd\" (UniqueName: \"kubernetes.io/projected/d2676f22-56e0-46ed-83d0-4d29fc704155-kube-api-access-5tthd\") on node \"crc\" DevicePath \"\""
Feb 27 00:14:05 crc kubenswrapper[4781]: I0227 00:14:05.452698 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535854-lplm8" event={"ID":"d2676f22-56e0-46ed-83d0-4d29fc704155","Type":"ContainerDied","Data":"ede374a6e9d1e538be7d431f4a287918634521cca633711915570e82f4a64bea"}
Feb 27 00:14:05 crc kubenswrapper[4781]: I0227 00:14:05.452742 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ede374a6e9d1e538be7d431f4a287918634521cca633711915570e82f4a64bea"
Feb 27 00:14:05 crc kubenswrapper[4781]: I0227 00:14:05.452775 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535854-lplm8"
Feb 27 00:14:05 crc kubenswrapper[4781]: I0227 00:14:05.727228 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535848-ccctv"]
Feb 27 00:14:05 crc kubenswrapper[4781]: I0227 00:14:05.735670 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535848-ccctv"]
Feb 27 00:14:07 crc kubenswrapper[4781]: I0227 00:14:07.316961 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df035290-8e3c-422b-90ac-573b592defcf" path="/var/lib/kubelet/pods/df035290-8e3c-422b-90ac-573b592defcf/volumes"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.161338 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"]
Feb 27 00:15:00 crc kubenswrapper[4781]: E0227 00:15:00.162239 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2676f22-56e0-46ed-83d0-4d29fc704155" containerName="oc"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.162255 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2676f22-56e0-46ed-83d0-4d29fc704155" containerName="oc"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.162378 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2676f22-56e0-46ed-83d0-4d29fc704155" containerName="oc"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.162936 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.165679 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.165771 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.176284 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"]
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.295720 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22b44418-6039-4859-96ba-1442e52b290e-secret-volume\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.295938 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htw5p\" (UniqueName: \"kubernetes.io/projected/22b44418-6039-4859-96ba-1442e52b290e-kube-api-access-htw5p\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.296034 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22b44418-6039-4859-96ba-1442e52b290e-config-volume\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.398034 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htw5p\" (UniqueName: \"kubernetes.io/projected/22b44418-6039-4859-96ba-1442e52b290e-kube-api-access-htw5p\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.398155 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22b44418-6039-4859-96ba-1442e52b290e-config-volume\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.398224 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22b44418-6039-4859-96ba-1442e52b290e-secret-volume\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.399664 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22b44418-6039-4859-96ba-1442e52b290e-config-volume\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.411075 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22b44418-6039-4859-96ba-1442e52b290e-secret-volume\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.421126 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htw5p\" (UniqueName: \"kubernetes.io/projected/22b44418-6039-4859-96ba-1442e52b290e-kube-api-access-htw5p\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.490123 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.737283 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"]
Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.864399 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" event={"ID":"22b44418-6039-4859-96ba-1442e52b290e","Type":"ContainerStarted","Data":"2e0f59f7e2c9a2e9d7f4c88bbf3022043ccf3ca7386d743bd5e15d7fdd6bdd78"}
Feb 27 00:15:01 crc kubenswrapper[4781]: I0227 00:15:01.875094 4781 generic.go:334] "Generic (PLEG): container finished" podID="22b44418-6039-4859-96ba-1442e52b290e" containerID="1f1ef56dac2e7ed3023bb30987d569aec06c9a96b99c1e9e939085397f33ecaf" exitCode=0
Feb 27 00:15:01 crc kubenswrapper[4781]: I0227 00:15:01.875140 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" event={"ID":"22b44418-6039-4859-96ba-1442e52b290e","Type":"ContainerDied","Data":"1f1ef56dac2e7ed3023bb30987d569aec06c9a96b99c1e9e939085397f33ecaf"}
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.117839 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.236048 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htw5p\" (UniqueName: \"kubernetes.io/projected/22b44418-6039-4859-96ba-1442e52b290e-kube-api-access-htw5p\") pod \"22b44418-6039-4859-96ba-1442e52b290e\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") "
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.236233 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22b44418-6039-4859-96ba-1442e52b290e-config-volume\") pod \"22b44418-6039-4859-96ba-1442e52b290e\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") "
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.236362 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22b44418-6039-4859-96ba-1442e52b290e-secret-volume\") pod \"22b44418-6039-4859-96ba-1442e52b290e\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") "
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.237658 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22b44418-6039-4859-96ba-1442e52b290e-config-volume" (OuterVolumeSpecName: "config-volume") pod "22b44418-6039-4859-96ba-1442e52b290e" (UID: "22b44418-6039-4859-96ba-1442e52b290e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.242522 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b44418-6039-4859-96ba-1442e52b290e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "22b44418-6039-4859-96ba-1442e52b290e" (UID: "22b44418-6039-4859-96ba-1442e52b290e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.242859 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22b44418-6039-4859-96ba-1442e52b290e-kube-api-access-htw5p" (OuterVolumeSpecName: "kube-api-access-htw5p") pod "22b44418-6039-4859-96ba-1442e52b290e" (UID: "22b44418-6039-4859-96ba-1442e52b290e"). InnerVolumeSpecName "kube-api-access-htw5p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.338212 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22b44418-6039-4859-96ba-1442e52b290e-config-volume\") on node \"crc\" DevicePath \"\""
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.338240 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22b44418-6039-4859-96ba-1442e52b290e-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.338250 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htw5p\" (UniqueName: \"kubernetes.io/projected/22b44418-6039-4859-96ba-1442e52b290e-kube-api-access-htw5p\") on node \"crc\" DevicePath \"\""
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.890388 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" event={"ID":"22b44418-6039-4859-96ba-1442e52b290e","Type":"ContainerDied","Data":"2e0f59f7e2c9a2e9d7f4c88bbf3022043ccf3ca7386d743bd5e15d7fdd6bdd78"}
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.890447 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e0f59f7e2c9a2e9d7f4c88bbf3022043ccf3ca7386d743bd5e15d7fdd6bdd78"
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.890466 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"
Feb 27 00:15:48 crc kubenswrapper[4781]: I0227 00:15:48.625207 4781 scope.go:117] "RemoveContainer" containerID="0c5e0439f18997d1945f8c92f69edded31054471dc31175a4e23307895e84fc9"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.149242 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535856-mznwl"]
Feb 27 00:16:00 crc kubenswrapper[4781]: E0227 00:16:00.150174 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b44418-6039-4859-96ba-1442e52b290e" containerName="collect-profiles"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.150191 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b44418-6039-4859-96ba-1442e52b290e" containerName="collect-profiles"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.150316 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b44418-6039-4859-96ba-1442e52b290e" containerName="collect-profiles"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.150756 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535856-mznwl"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.153205 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.153729 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.153870 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.159088 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535856-mznwl"]
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.229731 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j29wm\" (UniqueName: \"kubernetes.io/projected/778d83b2-2e0c-45b3-a296-aaba355c6427-kube-api-access-j29wm\") pod \"auto-csr-approver-29535856-mznwl\" (UID: \"778d83b2-2e0c-45b3-a296-aaba355c6427\") " pod="openshift-infra/auto-csr-approver-29535856-mznwl"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.330355 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j29wm\" (UniqueName: \"kubernetes.io/projected/778d83b2-2e0c-45b3-a296-aaba355c6427-kube-api-access-j29wm\") pod \"auto-csr-approver-29535856-mznwl\" (UID: \"778d83b2-2e0c-45b3-a296-aaba355c6427\") " pod="openshift-infra/auto-csr-approver-29535856-mznwl"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.364003 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j29wm\" (UniqueName: \"kubernetes.io/projected/778d83b2-2e0c-45b3-a296-aaba355c6427-kube-api-access-j29wm\") pod \"auto-csr-approver-29535856-mznwl\" (UID: \"778d83b2-2e0c-45b3-a296-aaba355c6427\") " pod="openshift-infra/auto-csr-approver-29535856-mznwl"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.479272 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535856-mznwl"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.739868 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535856-mznwl"]
Feb 27 00:16:00 crc kubenswrapper[4781]: W0227 00:16:00.747831 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod778d83b2_2e0c_45b3_a296_aaba355c6427.slice/crio-9f7a921969dd474fa7e006e106226a59df03f75c725b3c79c4a135ad3cc753fc WatchSource:0}: Error finding container 9f7a921969dd474fa7e006e106226a59df03f75c725b3c79c4a135ad3cc753fc: Status 404 returned error can't find the container with id 9f7a921969dd474fa7e006e106226a59df03f75c725b3c79c4a135ad3cc753fc
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.750216 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 27 00:16:01 crc kubenswrapper[4781]: I0227 00:16:01.292441 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535856-mznwl" event={"ID":"778d83b2-2e0c-45b3-a296-aaba355c6427","Type":"ContainerStarted","Data":"9f7a921969dd474fa7e006e106226a59df03f75c725b3c79c4a135ad3cc753fc"}
Feb 27 00:16:02 crc kubenswrapper[4781]: I0227 00:16:02.299591 4781 generic.go:334] "Generic (PLEG): container finished" podID="778d83b2-2e0c-45b3-a296-aaba355c6427" containerID="96bd641ff5c28b0d487d9f55a81f55a83bc758e496b0e0a0d2639cc8d0b260d5" exitCode=0
Feb 27 00:16:02 crc kubenswrapper[4781]: I0227 00:16:02.299689 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535856-mznwl" event={"ID":"778d83b2-2e0c-45b3-a296-aaba355c6427","Type":"ContainerDied","Data":"96bd641ff5c28b0d487d9f55a81f55a83bc758e496b0e0a0d2639cc8d0b260d5"}
Feb 27 00:16:03 crc kubenswrapper[4781]: I0227 00:16:03.559024 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535856-mznwl"
Feb 27 00:16:03 crc kubenswrapper[4781]: I0227 00:16:03.677440 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j29wm\" (UniqueName: \"kubernetes.io/projected/778d83b2-2e0c-45b3-a296-aaba355c6427-kube-api-access-j29wm\") pod \"778d83b2-2e0c-45b3-a296-aaba355c6427\" (UID: \"778d83b2-2e0c-45b3-a296-aaba355c6427\") "
Feb 27 00:16:03 crc kubenswrapper[4781]: I0227 00:16:03.684259 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/778d83b2-2e0c-45b3-a296-aaba355c6427-kube-api-access-j29wm" (OuterVolumeSpecName: "kube-api-access-j29wm") pod "778d83b2-2e0c-45b3-a296-aaba355c6427" (UID: "778d83b2-2e0c-45b3-a296-aaba355c6427"). InnerVolumeSpecName "kube-api-access-j29wm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:16:03 crc kubenswrapper[4781]: I0227 00:16:03.779886 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j29wm\" (UniqueName: \"kubernetes.io/projected/778d83b2-2e0c-45b3-a296-aaba355c6427-kube-api-access-j29wm\") on node \"crc\" DevicePath \"\""
Feb 27 00:16:04 crc kubenswrapper[4781]: I0227 00:16:04.315071 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535856-mznwl" event={"ID":"778d83b2-2e0c-45b3-a296-aaba355c6427","Type":"ContainerDied","Data":"9f7a921969dd474fa7e006e106226a59df03f75c725b3c79c4a135ad3cc753fc"}
Feb 27 00:16:04 crc kubenswrapper[4781]: I0227 00:16:04.315125 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f7a921969dd474fa7e006e106226a59df03f75c725b3c79c4a135ad3cc753fc"
Feb 27 00:16:04 crc kubenswrapper[4781]: I0227 00:16:04.315195 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535856-mznwl"
Feb 27 00:16:04 crc kubenswrapper[4781]: I0227 00:16:04.618376 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535850-wzxmm"]
Feb 27 00:16:04 crc kubenswrapper[4781]: I0227 00:16:04.621605 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535850-wzxmm"]
Feb 27 00:16:05 crc kubenswrapper[4781]: I0227 00:16:05.335569 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6acff23f-a17a-4f43-a7d6-32c8ccf4b084" path="/var/lib/kubelet/pods/6acff23f-a17a-4f43-a7d6-32c8ccf4b084/volumes"
Feb 27 00:16:12 crc kubenswrapper[4781]: I0227 00:16:12.896257 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 00:16:12 crc kubenswrapper[4781]: I0227 00:16:12.896826 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 00:16:42 crc kubenswrapper[4781]: I0227 00:16:42.895854 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 00:16:42 crc kubenswrapper[4781]: I0227 00:16:42.896448 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 00:16:48 crc kubenswrapper[4781]: I0227 00:16:48.659889 4781 scope.go:117] "RemoveContainer" containerID="313dbdb071dff64579864e870a0b09038434fbe0ef138af4cad66cd56ba9ca0d"
Feb 27 00:16:48 crc kubenswrapper[4781]: I0227 00:16:48.713356 4781 scope.go:117] "RemoveContainer" containerID="ec7472b1d4abe3539fd2b9c6a74552c975f1e7a845d80d7f3684a0e55a838de1"
Feb 27 00:16:48 crc kubenswrapper[4781]: I0227 00:16:48.732380 4781 scope.go:117] "RemoveContainer" containerID="7a5bc22436045a92f14d9e48387b73688e7285010edca28bce2bf80e2706ff98"
Feb 27 00:16:48 crc kubenswrapper[4781]: I0227 00:16:48.757163 4781 scope.go:117] "RemoveContainer" containerID="a316b4241144a66af579b620906b51669485f94b0371b42e5c56ba88e48d2942"
Feb 27 00:17:12 crc kubenswrapper[4781]: I0227 00:17:12.895056 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 00:17:12 crc kubenswrapper[4781]: I0227 00:17:12.895660 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 00:17:12 crc kubenswrapper[4781]: I0227 00:17:12.895712 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj"
Feb 27 00:17:12 crc kubenswrapper[4781]: I0227 00:17:12.896276 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98d9908780c17a21a4a701f7c994bde3e3fbb6ea911f1b4e11c3a27ce7db4d1d"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 27 00:17:12 crc kubenswrapper[4781]: I0227 00:17:12.896327 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://98d9908780c17a21a4a701f7c994bde3e3fbb6ea911f1b4e11c3a27ce7db4d1d" gracePeriod=600
Feb 27 00:17:13 crc kubenswrapper[4781]: I0227 00:17:13.320427 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="98d9908780c17a21a4a701f7c994bde3e3fbb6ea911f1b4e11c3a27ce7db4d1d" exitCode=0
Feb 27 00:17:13 crc kubenswrapper[4781]: I0227 00:17:13.328294
4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"98d9908780c17a21a4a701f7c994bde3e3fbb6ea911f1b4e11c3a27ce7db4d1d"} Feb 27 00:17:13 crc kubenswrapper[4781]: I0227 00:17:13.328701 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"4a4838ae34a31bed19fe04c8cb77eb7ca161a34e4d168445bf5a5f93e91a959a"} Feb 27 00:17:13 crc kubenswrapper[4781]: I0227 00:17:13.328984 4781 scope.go:117] "RemoveContainer" containerID="0a9584e9887d3110a6a6d2ad5c5024fb38c734637c177fd2cbddb2eae4932cdc" Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.565614 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"] Feb 27 00:17:38 crc kubenswrapper[4781]: E0227 00:17:38.566432 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="778d83b2-2e0c-45b3-a296-aaba355c6427" containerName="oc" Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.566448 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="778d83b2-2e0c-45b3-a296-aaba355c6427" containerName="oc" Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.566570 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="778d83b2-2e0c-45b3-a296-aaba355c6427" containerName="oc" Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.567484 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.569881 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.578209 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"] Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.675641 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffqqr\" (UniqueName: \"kubernetes.io/projected/d6e87b6c-eb25-4485-b639-6181c0ad86c7-kube-api-access-ffqqr\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.675736 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.675832 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" Feb 27 00:17:38 crc kubenswrapper[4781]: 
I0227 00:17:38.777567 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.777700 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.777802 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffqqr\" (UniqueName: \"kubernetes.io/projected/d6e87b6c-eb25-4485-b639-6181c0ad86c7-kube-api-access-ffqqr\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.779015 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.779409 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.799240 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffqqr\" (UniqueName: \"kubernetes.io/projected/d6e87b6c-eb25-4485-b639-6181c0ad86c7-kube-api-access-ffqqr\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.896391 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" Feb 27 00:17:39 crc kubenswrapper[4781]: I0227 00:17:39.108249 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"] Feb 27 00:17:39 crc kubenswrapper[4781]: I0227 00:17:39.503980 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" event={"ID":"d6e87b6c-eb25-4485-b639-6181c0ad86c7","Type":"ContainerStarted","Data":"f54b7e9e179332b1994e5d183e219982590ec13795f05a038610a4dda166e81a"} Feb 27 00:17:39 crc kubenswrapper[4781]: I0227 00:17:39.504020 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" event={"ID":"d6e87b6c-eb25-4485-b639-6181c0ad86c7","Type":"ContainerStarted","Data":"583dd8ad0d27e298ba7949507fe2a673ad3cc2c41e2f67d4ecf6a4498ef534cf"} Feb 27 00:17:40 crc kubenswrapper[4781]: I0227 00:17:40.520534 4781 
generic.go:334] "Generic (PLEG): container finished" podID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerID="f54b7e9e179332b1994e5d183e219982590ec13795f05a038610a4dda166e81a" exitCode=0 Feb 27 00:17:40 crc kubenswrapper[4781]: I0227 00:17:40.520729 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" event={"ID":"d6e87b6c-eb25-4485-b639-6181c0ad86c7","Type":"ContainerDied","Data":"f54b7e9e179332b1994e5d183e219982590ec13795f05a038610a4dda166e81a"} Feb 27 00:17:41 crc kubenswrapper[4781]: I0227 00:17:41.527434 4781 generic.go:334] "Generic (PLEG): container finished" podID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerID="0e4334abb705666c2ccafacdc16e3c52ccf2d9fad5d1cd17b493c56925fc3ffc" exitCode=0 Feb 27 00:17:41 crc kubenswrapper[4781]: I0227 00:17:41.527582 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" event={"ID":"d6e87b6c-eb25-4485-b639-6181c0ad86c7","Type":"ContainerDied","Data":"0e4334abb705666c2ccafacdc16e3c52ccf2d9fad5d1cd17b493c56925fc3ffc"} Feb 27 00:17:42 crc kubenswrapper[4781]: I0227 00:17:42.543955 4781 generic.go:334] "Generic (PLEG): container finished" podID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerID="63aec967b7a56bbc27060a22108d12825ab75ccaabd6f9eda49c69490997e3e4" exitCode=0 Feb 27 00:17:42 crc kubenswrapper[4781]: I0227 00:17:42.543994 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" event={"ID":"d6e87b6c-eb25-4485-b639-6181c0ad86c7","Type":"ContainerDied","Data":"63aec967b7a56bbc27060a22108d12825ab75ccaabd6f9eda49c69490997e3e4"} Feb 27 00:17:43 crc kubenswrapper[4781]: I0227 00:17:43.814745 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" Feb 27 00:17:43 crc kubenswrapper[4781]: I0227 00:17:43.946441 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-util\") pod \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " Feb 27 00:17:43 crc kubenswrapper[4781]: I0227 00:17:43.946589 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-bundle\") pod \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " Feb 27 00:17:43 crc kubenswrapper[4781]: I0227 00:17:43.949249 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-bundle" (OuterVolumeSpecName: "bundle") pod "d6e87b6c-eb25-4485-b639-6181c0ad86c7" (UID: "d6e87b6c-eb25-4485-b639-6181c0ad86c7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:17:43 crc kubenswrapper[4781]: I0227 00:17:43.949309 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffqqr\" (UniqueName: \"kubernetes.io/projected/d6e87b6c-eb25-4485-b639-6181c0ad86c7-kube-api-access-ffqqr\") pod \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " Feb 27 00:17:43 crc kubenswrapper[4781]: I0227 00:17:43.949650 4781 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:43 crc kubenswrapper[4781]: I0227 00:17:43.952034 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e87b6c-eb25-4485-b639-6181c0ad86c7-kube-api-access-ffqqr" (OuterVolumeSpecName: "kube-api-access-ffqqr") pod "d6e87b6c-eb25-4485-b639-6181c0ad86c7" (UID: "d6e87b6c-eb25-4485-b639-6181c0ad86c7"). InnerVolumeSpecName "kube-api-access-ffqqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:17:43 crc kubenswrapper[4781]: I0227 00:17:43.975941 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-util" (OuterVolumeSpecName: "util") pod "d6e87b6c-eb25-4485-b639-6181c0ad86c7" (UID: "d6e87b6c-eb25-4485-b639-6181c0ad86c7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:17:44 crc kubenswrapper[4781]: I0227 00:17:44.051327 4781 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-util\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:44 crc kubenswrapper[4781]: I0227 00:17:44.051365 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffqqr\" (UniqueName: \"kubernetes.io/projected/d6e87b6c-eb25-4485-b639-6181c0ad86c7-kube-api-access-ffqqr\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:44 crc kubenswrapper[4781]: I0227 00:17:44.561569 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" event={"ID":"d6e87b6c-eb25-4485-b639-6181c0ad86c7","Type":"ContainerDied","Data":"583dd8ad0d27e298ba7949507fe2a673ad3cc2c41e2f67d4ecf6a4498ef534cf"} Feb 27 00:17:44 crc kubenswrapper[4781]: I0227 00:17:44.561659 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" Feb 27 00:17:44 crc kubenswrapper[4781]: I0227 00:17:44.561681 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="583dd8ad0d27e298ba7949507fe2a673ad3cc2c41e2f67d4ecf6a4498ef534cf" Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.891696 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d2zn6"] Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.892529 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovn-controller" containerID="cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d" gracePeriod=30 Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.892587 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="nbdb" containerID="cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a" gracePeriod=30 Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.892657 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="northd" containerID="cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a" gracePeriod=30 Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.892696 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9" gracePeriod=30 Feb 27 00:17:49 crc 
kubenswrapper[4781]: I0227 00:17:49.892709 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="sbdb" containerID="cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381" gracePeriod=30 Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.892727 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kube-rbac-proxy-node" containerID="cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87" gracePeriod=30 Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.892757 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovn-acl-logging" containerID="cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403" gracePeriod=30 Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.938954 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" containerID="cri-o://ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c" gracePeriod=30 Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.185945 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/4.log" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.186515 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/3.log" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.188416 4781 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovn-acl-logging/0.log" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.188988 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovn-controller/0.log" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.189426 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.244753 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jfhx4"] Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.244964 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kubecfg-setup" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.244979 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kubecfg-setup" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.244988 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.244994 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245001 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245009 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 
00:17:50.245016 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerName="util" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245022 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerName="util" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245029 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245034 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245044 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kube-rbac-proxy-ovn-metrics" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245050 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kube-rbac-proxy-ovn-metrics" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245059 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="northd" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245065 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="northd" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245074 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245082 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245089 
4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerName="extract" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245095 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerName="extract" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245105 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovn-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245113 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovn-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245126 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="sbdb" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245134 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="sbdb" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245147 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kube-rbac-proxy-node" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245155 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kube-rbac-proxy-node" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245163 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovn-acl-logging" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245170 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovn-acl-logging" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245178 4781 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="nbdb" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245185 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="nbdb" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245192 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerName="pull" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245198 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerName="pull" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245372 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovn-acl-logging" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245385 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245392 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="nbdb" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245400 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerName="extract" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245408 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="northd" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245418 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245426 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kube-rbac-proxy-ovn-metrics" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245434 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245442 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="sbdb" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245451 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovn-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245463 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kube-rbac-proxy-node" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245580 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245588 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245721 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245902 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.247119 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325597 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-etc-openvswitch\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325676 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovn-node-metrics-cert\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325700 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-var-lib-openvswitch\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325715 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-systemd-units\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325737 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-systemd\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325734 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325751 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-netns\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325797 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325829 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325826 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325866 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325839 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-var-lib-cni-networks-ovn-kubernetes\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325913 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-ovn-kubernetes\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325936 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-openvswitch\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325963 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-script-lib\") pod 
\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325968 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325993 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326012 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-kubelet\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326041 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-ovn\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326072 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-log-socket\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: 
\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326083 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326090 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-slash\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326125 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-node-log\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326159 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-env-overrides\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326187 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5qlg\" (UniqueName: \"kubernetes.io/projected/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-kube-api-access-r5qlg\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326212 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-config\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326291 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-netd\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326315 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-bin\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326459 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326526 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-log-socket\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326804 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-slash\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326926 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-var-lib-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327006 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-ovn\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326573 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). 
InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326574 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-log-socket" (OuterVolumeSpecName: "log-socket") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326609 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-slash" (OuterVolumeSpecName: "host-slash") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326676 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326685 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-node-log" (OuterVolumeSpecName: "node-log") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326701 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326720 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326925 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327123 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-kubelet\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327180 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-systemd-units\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327227 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327244 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovnkube-config\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327271 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-env-overrides\") pod 
\"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327290 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-node-log\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327303 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-etc-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327326 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327349 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-cni-bin\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327375 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-cni-netd\") 
pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327399 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovnkube-script-lib\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327451 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-run-netns\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327467 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-run-ovn-kubernetes\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327482 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-systemd\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327509 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58jmz\" (UniqueName: 
\"kubernetes.io/projected/78f87967-e9e0-4e6a-ab3b-2216e4272c02-kube-api-access-58jmz\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327538 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovn-node-metrics-cert\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327578 4781 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327589 4781 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-log-socket\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327599 4781 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-slash\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327609 4781 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-node-log\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327617 4781 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327638 4781 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327647 4781 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327655 4781 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327664 4781 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327673 4781 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327681 4781 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327689 4781 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327697 4781 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327707 4781 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327715 4781 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327724 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327732 4781 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.337887 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-kube-api-access-r5qlg" (OuterVolumeSpecName: "kube-api-access-r5qlg") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "kube-api-access-r5qlg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.338272 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.338301 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428294 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-cni-bin\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428550 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-cni-netd\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428574 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovnkube-script-lib\") pod \"ovnkube-node-jfhx4\" 
(UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428597 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-run-netns\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428610 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-run-ovn-kubernetes\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428644 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-systemd\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428667 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58jmz\" (UniqueName: \"kubernetes.io/projected/78f87967-e9e0-4e6a-ab3b-2216e4272c02-kube-api-access-58jmz\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428687 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovn-node-metrics-cert\") pod \"ovnkube-node-jfhx4\" (UID: 
\"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428721 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-log-socket\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428737 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-slash\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428753 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-var-lib-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428771 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-ovn\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428787 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-kubelet\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 
00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428802 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-systemd-units\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428822 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428836 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovnkube-config\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428874 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-env-overrides\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428888 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-node-log\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428901 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-etc-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428917 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428950 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5qlg\" (UniqueName: \"kubernetes.io/projected/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-kube-api-access-r5qlg\") on node \"crc\" DevicePath \"\""
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428961 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428971 4781 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-systemd\") on node \"crc\" DevicePath \"\""
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.429009 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428401 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-cni-bin\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.429049 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-cni-netd\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.429588 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovnkube-script-lib\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.429619 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-run-netns\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.429662 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-run-ovn-kubernetes\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.429682 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-systemd\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.430293 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-systemd-units\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.430372 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-log-socket\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.430399 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-slash\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.430423 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-var-lib-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.430451 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-ovn\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.430480 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-kubelet\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.430993 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-env-overrides\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.431034 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.431457 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovnkube-config\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.431525 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-node-log\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.431557 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-etc-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.436397 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovn-node-metrics-cert\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.448067 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58jmz\" (UniqueName: \"kubernetes.io/projected/78f87967-e9e0-4e6a-ab3b-2216e4272c02-kube-api-access-58jmz\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.559987 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.597002 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/2.log"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.601820 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/1.log"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.601863 4781 generic.go:334] "Generic (PLEG): container finished" podID="9a6dd1e0-45ab-46f0-b298-d89e47aaeecb" containerID="a286864c68415e96f38bba630ac2325989837881e34a926c93977715f330a129" exitCode=2
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.601937 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlstj" event={"ID":"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb","Type":"ContainerDied","Data":"a286864c68415e96f38bba630ac2325989837881e34a926c93977715f330a129"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.601983 4781 scope.go:117] "RemoveContainer" containerID="3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.602487 4781 scope.go:117] "RemoveContainer" containerID="a286864c68415e96f38bba630ac2325989837881e34a926c93977715f330a129"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.602692 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-tlstj_openshift-multus(9a6dd1e0-45ab-46f0-b298-d89e47aaeecb)\"" pod="openshift-multus/multus-tlstj" podUID="9a6dd1e0-45ab-46f0-b298-d89e47aaeecb"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.610748 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/4.log"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.611359 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/3.log"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.615760 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovn-acl-logging/0.log"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616224 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovn-controller/0.log"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616766 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c" exitCode=2
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616794 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381" exitCode=0
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616803 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a" exitCode=0
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616812 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a" exitCode=0
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616823 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9" exitCode=0
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616831 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87" exitCode=0
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616838 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403" exitCode=143
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616846 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d" exitCode=143
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616867 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617427 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617459 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617897 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617914 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617925 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617938 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617952 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617959 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617965 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617971 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617978 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617984 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617991 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617997 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618003 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618011 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618026 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618033 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618039 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618045 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618051 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618056 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618063 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618072 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618078 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618085 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618094 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618105 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618115 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618123 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618129 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618135 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618142 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618148 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618154 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618161 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618167 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618176 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"96af27195e73c8a72996dd4d8221316b5eec9c31c92a51b4fb0d127265c1c59f"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618185 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618193 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618200 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618206 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618212 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618218 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618224 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618230 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618235 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618241 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"}
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618349 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.661819 4781 scope.go:117] "RemoveContainer" containerID="ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.704151 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.705298 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d2zn6"]
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.709888 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d2zn6"]
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.727949 4781 scope.go:117] "RemoveContainer" containerID="f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.748274 4781 scope.go:117] "RemoveContainer" containerID="87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.805566 4781 scope.go:117] "RemoveContainer" containerID="abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.820041 4781 scope.go:117] "RemoveContainer" containerID="6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.833718 4781 scope.go:117] "RemoveContainer" containerID="49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.848573 4781 scope.go:117] "RemoveContainer" containerID="7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.860793 4781 scope.go:117] "RemoveContainer" containerID="4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.877838 4781 scope.go:117] "RemoveContainer" containerID="34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.896876 4781 scope.go:117] "RemoveContainer" containerID="ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.897481 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": container with ID starting with ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c not found: ID does not exist" containerID="ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.897519 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"} err="failed to get container status \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": rpc error: code = NotFound desc = could not find container \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": container with ID starting with ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c not found: ID does not exist"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.897557 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.898858 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": container with ID starting with ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867 not found: ID does not exist" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.898907 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"} err="failed to get container status \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": rpc error: code = NotFound desc = could not find container \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": container with ID starting with ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867 not found: ID does not exist"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.898941 4781 scope.go:117] "RemoveContainer" containerID="f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.899265 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": container with ID starting with f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381 not found: ID does not exist" containerID="f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.899300 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"} err="failed to get container status \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": rpc error: code = NotFound desc = could not find container \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": container with ID starting with f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381 not found: ID does not exist"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.899324 4781 scope.go:117] "RemoveContainer" containerID="87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.899606 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": container with ID starting with 87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a not found: ID does not exist" containerID="87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.899652 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"} err="failed to get container status \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": rpc error: code = NotFound desc = could not find container \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": container with ID starting with 87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a not found: ID does not exist"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.899670 4781 scope.go:117] "RemoveContainer" containerID="abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.899918 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": container with ID starting with abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a not found: ID does not exist" containerID="abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.899941 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"} err="failed to get container status \"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": rpc error: code = NotFound desc = could not find container \"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": container with ID starting with abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a not found: ID does not exist"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.899955 4781 scope.go:117] "RemoveContainer" containerID="6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.900179 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": container with ID starting with 6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9 not found: ID does not exist" containerID="6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.900210 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"} err="failed to get container status \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": rpc error: code = NotFound desc = could not find container \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": container with ID starting with 6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9 not found: ID does not exist"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.900230 4781 scope.go:117] "RemoveContainer" containerID="49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.900456 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": container with ID starting with 49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87 not found: ID does not exist" containerID="49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.900488 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"} err="failed to get container status \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": rpc error: code = NotFound desc = could not find container \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": container with ID starting with 49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87 not found: ID does not exist"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.900504 4781 scope.go:117] "RemoveContainer" containerID="7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.900749 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": container with ID starting with 7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403 not found: ID does not exist" containerID="7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.900789 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"} err="failed to get container status \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": rpc error: code = NotFound desc = could 
not find container \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": container with ID starting with 7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.900817 4781 scope.go:117] "RemoveContainer" containerID="4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.901112 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": container with ID starting with 4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d not found: ID does not exist" containerID="4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.901142 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"} err="failed to get container status \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": rpc error: code = NotFound desc = could not find container \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": container with ID starting with 4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.901161 4781 scope.go:117] "RemoveContainer" containerID="34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.901727 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": container with ID starting with 34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4 not found: 
ID does not exist" containerID="34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.901760 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"} err="failed to get container status \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": rpc error: code = NotFound desc = could not find container \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": container with ID starting with 34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.901782 4781 scope.go:117] "RemoveContainer" containerID="ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.902198 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"} err="failed to get container status \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": rpc error: code = NotFound desc = could not find container \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": container with ID starting with ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.902220 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.902538 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"} err="failed to get container status \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": rpc error: code = 
NotFound desc = could not find container \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": container with ID starting with ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.902558 4781 scope.go:117] "RemoveContainer" containerID="f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.902919 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"} err="failed to get container status \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": rpc error: code = NotFound desc = could not find container \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": container with ID starting with f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.902936 4781 scope.go:117] "RemoveContainer" containerID="87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.903195 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"} err="failed to get container status \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": rpc error: code = NotFound desc = could not find container \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": container with ID starting with 87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.903217 4781 scope.go:117] "RemoveContainer" containerID="abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a" Feb 27 00:17:50 crc 
kubenswrapper[4781]: I0227 00:17:50.903544 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"} err="failed to get container status \"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": rpc error: code = NotFound desc = could not find container \"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": container with ID starting with abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.903569 4781 scope.go:117] "RemoveContainer" containerID="6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.903834 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"} err="failed to get container status \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": rpc error: code = NotFound desc = could not find container \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": container with ID starting with 6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.903857 4781 scope.go:117] "RemoveContainer" containerID="49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.904085 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"} err="failed to get container status \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": rpc error: code = NotFound desc = could not find container \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": container 
with ID starting with 49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.904104 4781 scope.go:117] "RemoveContainer" containerID="7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.904328 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"} err="failed to get container status \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": rpc error: code = NotFound desc = could not find container \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": container with ID starting with 7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.904350 4781 scope.go:117] "RemoveContainer" containerID="4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.904779 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"} err="failed to get container status \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": rpc error: code = NotFound desc = could not find container \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": container with ID starting with 4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.904825 4781 scope.go:117] "RemoveContainer" containerID="34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.905303 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"} err="failed to get container status \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": rpc error: code = NotFound desc = could not find container \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": container with ID starting with 34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.905330 4781 scope.go:117] "RemoveContainer" containerID="ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.905557 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"} err="failed to get container status \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": rpc error: code = NotFound desc = could not find container \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": container with ID starting with ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.905578 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.905785 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"} err="failed to get container status \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": rpc error: code = NotFound desc = could not find container \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": container with ID starting with ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867 not found: ID does not 
exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.905803 4781 scope.go:117] "RemoveContainer" containerID="f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.905997 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"} err="failed to get container status \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": rpc error: code = NotFound desc = could not find container \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": container with ID starting with f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.906037 4781 scope.go:117] "RemoveContainer" containerID="87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.906249 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"} err="failed to get container status \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": rpc error: code = NotFound desc = could not find container \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": container with ID starting with 87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.906280 4781 scope.go:117] "RemoveContainer" containerID="abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.906530 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"} err="failed to get container status 
\"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": rpc error: code = NotFound desc = could not find container \"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": container with ID starting with abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.906555 4781 scope.go:117] "RemoveContainer" containerID="6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.906765 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"} err="failed to get container status \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": rpc error: code = NotFound desc = could not find container \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": container with ID starting with 6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.906794 4781 scope.go:117] "RemoveContainer" containerID="49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.906989 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"} err="failed to get container status \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": rpc error: code = NotFound desc = could not find container \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": container with ID starting with 49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.907014 4781 scope.go:117] "RemoveContainer" 
containerID="7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.907200 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"} err="failed to get container status \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": rpc error: code = NotFound desc = could not find container \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": container with ID starting with 7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.907223 4781 scope.go:117] "RemoveContainer" containerID="4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.907518 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"} err="failed to get container status \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": rpc error: code = NotFound desc = could not find container \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": container with ID starting with 4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.907557 4781 scope.go:117] "RemoveContainer" containerID="34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.907797 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"} err="failed to get container status \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": rpc error: code = NotFound desc = could 
not find container \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": container with ID starting with 34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.907814 4781 scope.go:117] "RemoveContainer" containerID="ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908021 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"} err="failed to get container status \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": rpc error: code = NotFound desc = could not find container \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": container with ID starting with ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908059 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908237 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"} err="failed to get container status \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": rpc error: code = NotFound desc = could not find container \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": container with ID starting with ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908264 4781 scope.go:117] "RemoveContainer" containerID="f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 
00:17:50.908468 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"} err="failed to get container status \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": rpc error: code = NotFound desc = could not find container \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": container with ID starting with f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908489 4781 scope.go:117] "RemoveContainer" containerID="87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908728 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"} err="failed to get container status \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": rpc error: code = NotFound desc = could not find container \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": container with ID starting with 87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908747 4781 scope.go:117] "RemoveContainer" containerID="abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908934 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"} err="failed to get container status \"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": rpc error: code = NotFound desc = could not find container \"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": container with ID starting with 
abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908949 4781 scope.go:117] "RemoveContainer" containerID="6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.909403 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"} err="failed to get container status \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": rpc error: code = NotFound desc = could not find container \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": container with ID starting with 6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.909458 4781 scope.go:117] "RemoveContainer" containerID="49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.909740 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"} err="failed to get container status \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": rpc error: code = NotFound desc = could not find container \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": container with ID starting with 49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.909763 4781 scope.go:117] "RemoveContainer" containerID="7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.910097 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"} err="failed to get container status \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": rpc error: code = NotFound desc = could not find container \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": container with ID starting with 7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.910121 4781 scope.go:117] "RemoveContainer" containerID="4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.910537 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"} err="failed to get container status \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": rpc error: code = NotFound desc = could not find container \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": container with ID starting with 4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.910561 4781 scope.go:117] "RemoveContainer" containerID="34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.910924 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"} err="failed to get container status \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": rpc error: code = NotFound desc = could not find container \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": container with ID starting with 34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4 not found: ID does not 
exist" Feb 27 00:17:51 crc kubenswrapper[4781]: I0227 00:17:51.315548 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" path="/var/lib/kubelet/pods/12a87c22-b4e1-4aa9-8b3e-a34f7d159239/volumes" Feb 27 00:17:51 crc kubenswrapper[4781]: I0227 00:17:51.640418 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/2.log" Feb 27 00:17:51 crc kubenswrapper[4781]: I0227 00:17:51.646954 4781 generic.go:334] "Generic (PLEG): container finished" podID="78f87967-e9e0-4e6a-ab3b-2216e4272c02" containerID="4779bf1ca393254a54cc03c243bf87d6a37de3a0aba2f25a6bd06c83d56ea5f0" exitCode=0 Feb 27 00:17:51 crc kubenswrapper[4781]: I0227 00:17:51.646989 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerDied","Data":"4779bf1ca393254a54cc03c243bf87d6a37de3a0aba2f25a6bd06c83d56ea5f0"} Feb 27 00:17:51 crc kubenswrapper[4781]: I0227 00:17:51.647009 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"31e65d716a1aa553f7a005b056f0c642a16e5d75ee05277a14fb364aca8ff0b2"} Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.655153 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"16fec0c4cec4317659e931ec067a32dbfee38a3efb068b50ec5c22d2ca58f2da"} Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.655743 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" 
event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"84cb6e9827d8d3a53a7e151312e0866cba75754fe5342a5148d2c122359894fd"} Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.655756 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"c95d092c213b49db3f38593dd71b479dd138d69deb902eacf58674dbae8096e2"} Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.655765 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"eda00b6b03c60717614c97557953d6fba2f73eca8083836778766458999f9c0f"} Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.655775 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"b44befc8472b8b3fd3153503b3fa4f58b28a70c1b7695d6ff97ab13081d3feb7"} Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.655783 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"9f2591640a73dda244101bcceab9811e32c1623f492cb472232e7dad03b03a6a"} Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.913620 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr"] Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.914490 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.917583 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-l667z" Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.918233 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.919264 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.027867 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7"] Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.028659 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.030248 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-s4267" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.030586 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.036005 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m"] Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.036667 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.061525 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwd7g\" (UniqueName: \"kubernetes.io/projected/c62f5f48-b15f-4d70-837c-a05addc48839-kube-api-access-zwd7g\") pod \"obo-prometheus-operator-68bc856cb9-rbdmr\" (UID: \"c62f5f48-b15f-4d70-837c-a05addc48839\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.153883 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-m6jxs"] Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.154519 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.156778 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-79fv9" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.156778 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.162386 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbb658fa-808d-4c87-b81e-63863f31382f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m\" (UID: \"cbb658fa-808d-4c87-b81e-63863f31382f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.162450 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwd7g\" (UniqueName: 
\"kubernetes.io/projected/c62f5f48-b15f-4d70-837c-a05addc48839-kube-api-access-zwd7g\") pod \"obo-prometheus-operator-68bc856cb9-rbdmr\" (UID: \"c62f5f48-b15f-4d70-837c-a05addc48839\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.162541 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5abff2aa-f9cb-469e-9a7e-7a6eea64d4db-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7\" (UID: \"5abff2aa-f9cb-469e-9a7e-7a6eea64d4db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.162807 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbb658fa-808d-4c87-b81e-63863f31382f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m\" (UID: \"cbb658fa-808d-4c87-b81e-63863f31382f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.162856 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5abff2aa-f9cb-469e-9a7e-7a6eea64d4db-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7\" (UID: \"5abff2aa-f9cb-469e-9a7e-7a6eea64d4db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.186572 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwd7g\" (UniqueName: \"kubernetes.io/projected/c62f5f48-b15f-4d70-837c-a05addc48839-kube-api-access-zwd7g\") pod \"obo-prometheus-operator-68bc856cb9-rbdmr\" (UID: 
\"c62f5f48-b15f-4d70-837c-a05addc48839\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.231219 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.254197 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(acf9c3d2dd034d323469dc4151d393eab24946aa7c15575d65ae985cb320fe12): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.254288 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(acf9c3d2dd034d323469dc4151d393eab24946aa7c15575d65ae985cb320fe12): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.254316 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(acf9c3d2dd034d323469dc4151d393eab24946aa7c15575d65ae985cb320fe12): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.254372 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators(c62f5f48-b15f-4d70-837c-a05addc48839)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators(c62f5f48-b15f-4d70-837c-a05addc48839)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(acf9c3d2dd034d323469dc4151d393eab24946aa7c15575d65ae985cb320fe12): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" podUID="c62f5f48-b15f-4d70-837c-a05addc48839" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.263828 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fe8e5f0-6c7b-42bd-9604-85a90477d143-observability-operator-tls\") pod \"observability-operator-59bdc8b94-m6jxs\" (UID: \"3fe8e5f0-6c7b-42bd-9604-85a90477d143\") " pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.263894 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9nmx\" (UniqueName: \"kubernetes.io/projected/3fe8e5f0-6c7b-42bd-9604-85a90477d143-kube-api-access-v9nmx\") pod \"observability-operator-59bdc8b94-m6jxs\" (UID: \"3fe8e5f0-6c7b-42bd-9604-85a90477d143\") " pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.263925 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5abff2aa-f9cb-469e-9a7e-7a6eea64d4db-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7\" (UID: \"5abff2aa-f9cb-469e-9a7e-7a6eea64d4db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.263983 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbb658fa-808d-4c87-b81e-63863f31382f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m\" (UID: \"cbb658fa-808d-4c87-b81e-63863f31382f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.264004 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5abff2aa-f9cb-469e-9a7e-7a6eea64d4db-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7\" (UID: \"5abff2aa-f9cb-469e-9a7e-7a6eea64d4db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.264035 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbb658fa-808d-4c87-b81e-63863f31382f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m\" (UID: \"cbb658fa-808d-4c87-b81e-63863f31382f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.269031 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbb658fa-808d-4c87-b81e-63863f31382f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m\" (UID: 
\"cbb658fa-808d-4c87-b81e-63863f31382f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.269116 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbb658fa-808d-4c87-b81e-63863f31382f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m\" (UID: \"cbb658fa-808d-4c87-b81e-63863f31382f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.269381 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5abff2aa-f9cb-469e-9a7e-7a6eea64d4db-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7\" (UID: \"5abff2aa-f9cb-469e-9a7e-7a6eea64d4db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.271283 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5abff2aa-f9cb-469e-9a7e-7a6eea64d4db-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7\" (UID: \"5abff2aa-f9cb-469e-9a7e-7a6eea64d4db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.271905 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-l5ppf"] Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.272535 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.275785 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-6zvft" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.346876 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.354417 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.365452 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fe8e5f0-6c7b-42bd-9604-85a90477d143-observability-operator-tls\") pod \"observability-operator-59bdc8b94-m6jxs\" (UID: \"3fe8e5f0-6c7b-42bd-9604-85a90477d143\") " pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.365506 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxpsj\" (UniqueName: \"kubernetes.io/projected/1a3a6a15-797e-4cfe-8e21-3a813460012d-kube-api-access-nxpsj\") pod \"perses-operator-5bf474d74f-l5ppf\" (UID: \"1a3a6a15-797e-4cfe-8e21-3a813460012d\") " pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.365538 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9nmx\" (UniqueName: \"kubernetes.io/projected/3fe8e5f0-6c7b-42bd-9604-85a90477d143-kube-api-access-v9nmx\") pod \"observability-operator-59bdc8b94-m6jxs\" (UID: \"3fe8e5f0-6c7b-42bd-9604-85a90477d143\") " 
pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.365577 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a3a6a15-797e-4cfe-8e21-3a813460012d-openshift-service-ca\") pod \"perses-operator-5bf474d74f-l5ppf\" (UID: \"1a3a6a15-797e-4cfe-8e21-3a813460012d\") " pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.368312 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fe8e5f0-6c7b-42bd-9604-85a90477d143-observability-operator-tls\") pod \"observability-operator-59bdc8b94-m6jxs\" (UID: \"3fe8e5f0-6c7b-42bd-9604-85a90477d143\") " pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.387075 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9nmx\" (UniqueName: \"kubernetes.io/projected/3fe8e5f0-6c7b-42bd-9604-85a90477d143-kube-api-access-v9nmx\") pod \"observability-operator-59bdc8b94-m6jxs\" (UID: \"3fe8e5f0-6c7b-42bd-9604-85a90477d143\") " pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.392588 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(b9ad5d44391eec9fc267745f98887580747de5faa3ec900c41841eae19661de2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.392696 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(b9ad5d44391eec9fc267745f98887580747de5faa3ec900c41841eae19661de2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.392724 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(b9ad5d44391eec9fc267745f98887580747de5faa3ec900c41841eae19661de2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.392784 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators(5abff2aa-f9cb-469e-9a7e-7a6eea64d4db)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators(5abff2aa-f9cb-469e-9a7e-7a6eea64d4db)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(b9ad5d44391eec9fc267745f98887580747de5faa3ec900c41841eae19661de2): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" podUID="5abff2aa-f9cb-469e-9a7e-7a6eea64d4db" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.399453 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(e9a3f918348b5c9dd835803ad19aab2cb18d72467945e7b9a49e7cca04aed52c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.399500 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(e9a3f918348b5c9dd835803ad19aab2cb18d72467945e7b9a49e7cca04aed52c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.399523 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(e9a3f918348b5c9dd835803ad19aab2cb18d72467945e7b9a49e7cca04aed52c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.399565 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators(cbb658fa-808d-4c87-b81e-63863f31382f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators(cbb658fa-808d-4c87-b81e-63863f31382f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(e9a3f918348b5c9dd835803ad19aab2cb18d72467945e7b9a49e7cca04aed52c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" podUID="cbb658fa-808d-4c87-b81e-63863f31382f" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.466386 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a3a6a15-797e-4cfe-8e21-3a813460012d-openshift-service-ca\") pod \"perses-operator-5bf474d74f-l5ppf\" (UID: \"1a3a6a15-797e-4cfe-8e21-3a813460012d\") " pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.466459 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxpsj\" (UniqueName: \"kubernetes.io/projected/1a3a6a15-797e-4cfe-8e21-3a813460012d-kube-api-access-nxpsj\") pod \"perses-operator-5bf474d74f-l5ppf\" (UID: \"1a3a6a15-797e-4cfe-8e21-3a813460012d\") " pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.467403 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a3a6a15-797e-4cfe-8e21-3a813460012d-openshift-service-ca\") pod \"perses-operator-5bf474d74f-l5ppf\" (UID: \"1a3a6a15-797e-4cfe-8e21-3a813460012d\") " pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.467684 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.490191 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(c9439ac03a626440c4a67097ff2d14fb741850684f6c8e5171ba7751d2e17119): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.490245 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(c9439ac03a626440c4a67097ff2d14fb741850684f6c8e5171ba7751d2e17119): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.490263 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(c9439ac03a626440c4a67097ff2d14fb741850684f6c8e5171ba7751d2e17119): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.490298 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-m6jxs_openshift-operators(3fe8e5f0-6c7b-42bd-9604-85a90477d143)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-m6jxs_openshift-operators(3fe8e5f0-6c7b-42bd-9604-85a90477d143)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(c9439ac03a626440c4a67097ff2d14fb741850684f6c8e5171ba7751d2e17119): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" podUID="3fe8e5f0-6c7b-42bd-9604-85a90477d143" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.490370 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxpsj\" (UniqueName: \"kubernetes.io/projected/1a3a6a15-797e-4cfe-8e21-3a813460012d-kube-api-access-nxpsj\") pod \"perses-operator-5bf474d74f-l5ppf\" (UID: \"1a3a6a15-797e-4cfe-8e21-3a813460012d\") " pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.609016 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.629078 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(a1316e086242627f09e5957be3f514dad12cd3073dc2d380126c3fa8511f7666): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.629147 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(a1316e086242627f09e5957be3f514dad12cd3073dc2d380126c3fa8511f7666): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.629170 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(a1316e086242627f09e5957be3f514dad12cd3073dc2d380126c3fa8511f7666): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.629216 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-l5ppf_openshift-operators(1a3a6a15-797e-4cfe-8e21-3a813460012d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-l5ppf_openshift-operators(1a3a6a15-797e-4cfe-8e21-3a813460012d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(a1316e086242627f09e5957be3f514dad12cd3073dc2d380126c3fa8511f7666): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" podUID="1a3a6a15-797e-4cfe-8e21-3a813460012d" Feb 27 00:17:54 crc kubenswrapper[4781]: I0227 00:17:54.668798 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"75adc4ae198a13b0195a243660d4297b8663ce118937ac9ddf788b89c83b01b8"} Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.695986 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"51116f40c35ad454a2ffd20e3c51ff29acacefc1e89326249603a688f8c6e13a"} Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.696401 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.696414 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.727734 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" podStartSLOduration=7.727717015 podStartE2EDuration="7.727717015s" podCreationTimestamp="2026-02-27 00:17:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:17:57.727042699 +0000 UTC m=+746.984582253" watchObservedRunningTime="2026-02-27 00:17:57.727717015 +0000 UTC m=+746.985256569" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.735945 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.823765 4781 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr"] Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.823872 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.824182 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.840258 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-m6jxs"] Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.840367 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.840818 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.848811 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7"] Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.862573 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.863310 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.881463 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(3e2d4f447c77304e54d5e9ab985166aa513fdb1b35003b643f92150cfbc07400): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.881581 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(3e2d4f447c77304e54d5e9ab985166aa513fdb1b35003b643f92150cfbc07400): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.881611 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(3e2d4f447c77304e54d5e9ab985166aa513fdb1b35003b643f92150cfbc07400): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.881716 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators(c62f5f48-b15f-4d70-837c-a05addc48839)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators(c62f5f48-b15f-4d70-837c-a05addc48839)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(3e2d4f447c77304e54d5e9ab985166aa513fdb1b35003b643f92150cfbc07400): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" podUID="c62f5f48-b15f-4d70-837c-a05addc48839" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.893717 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m"] Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.893819 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.894073 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.899886 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-l5ppf"] Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.900013 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.900565 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.913948 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(3d17b4f254747363491e0f9b5b7184aed40a0a05daa08d692096524b65a07108): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.914084 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(3d17b4f254747363491e0f9b5b7184aed40a0a05daa08d692096524b65a07108): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.914112 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(3d17b4f254747363491e0f9b5b7184aed40a0a05daa08d692096524b65a07108): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.914167 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators(5abff2aa-f9cb-469e-9a7e-7a6eea64d4db)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators(5abff2aa-f9cb-469e-9a7e-7a6eea64d4db)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(3d17b4f254747363491e0f9b5b7184aed40a0a05daa08d692096524b65a07108): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" podUID="5abff2aa-f9cb-469e-9a7e-7a6eea64d4db" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.922931 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(b8535a2fdff6659d45bcf9b47e61354b9449438e65a8c33e4182da8d8c3d277d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.922988 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(b8535a2fdff6659d45bcf9b47e61354b9449438e65a8c33e4182da8d8c3d277d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.923012 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(b8535a2fdff6659d45bcf9b47e61354b9449438e65a8c33e4182da8d8c3d277d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.923052 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-m6jxs_openshift-operators(3fe8e5f0-6c7b-42bd-9604-85a90477d143)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-m6jxs_openshift-operators(3fe8e5f0-6c7b-42bd-9604-85a90477d143)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(b8535a2fdff6659d45bcf9b47e61354b9449438e65a8c33e4182da8d8c3d277d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" podUID="3fe8e5f0-6c7b-42bd-9604-85a90477d143" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.940066 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(9f3f595c1473a4901329f59296c9a978e61e3ea75aed3a4f5f2d0034bd3424be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.940132 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(9f3f595c1473a4901329f59296c9a978e61e3ea75aed3a4f5f2d0034bd3424be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.940153 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(9f3f595c1473a4901329f59296c9a978e61e3ea75aed3a4f5f2d0034bd3424be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.940198 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators(cbb658fa-808d-4c87-b81e-63863f31382f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators(cbb658fa-808d-4c87-b81e-63863f31382f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(9f3f595c1473a4901329f59296c9a978e61e3ea75aed3a4f5f2d0034bd3424be): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" podUID="cbb658fa-808d-4c87-b81e-63863f31382f" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.958910 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(016693864dfc49d232fe7737deef5dea5375af5b1e910ee34530ac943cc9c15b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.958987 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(016693864dfc49d232fe7737deef5dea5375af5b1e910ee34530ac943cc9c15b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.959012 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(016693864dfc49d232fe7737deef5dea5375af5b1e910ee34530ac943cc9c15b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.959067 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-l5ppf_openshift-operators(1a3a6a15-797e-4cfe-8e21-3a813460012d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-l5ppf_openshift-operators(1a3a6a15-797e-4cfe-8e21-3a813460012d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(016693864dfc49d232fe7737deef5dea5375af5b1e910ee34530ac943cc9c15b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" podUID="1a3a6a15-797e-4cfe-8e21-3a813460012d" Feb 27 00:17:58 crc kubenswrapper[4781]: I0227 00:17:58.700411 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:58 crc kubenswrapper[4781]: I0227 00:17:58.759000 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.123785 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535858-9fs8d"] Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.125156 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.126898 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.126941 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.128160 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.133410 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535858-9fs8d"] Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.262526 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl65t\" (UniqueName: \"kubernetes.io/projected/3bb1e1bd-28ea-42f4-96d5-534db2674e68-kube-api-access-zl65t\") pod \"auto-csr-approver-29535858-9fs8d\" (UID: \"3bb1e1bd-28ea-42f4-96d5-534db2674e68\") " pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.363745 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl65t\" (UniqueName: \"kubernetes.io/projected/3bb1e1bd-28ea-42f4-96d5-534db2674e68-kube-api-access-zl65t\") pod \"auto-csr-approver-29535858-9fs8d\" (UID: \"3bb1e1bd-28ea-42f4-96d5-534db2674e68\") " pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.386105 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl65t\" (UniqueName: \"kubernetes.io/projected/3bb1e1bd-28ea-42f4-96d5-534db2674e68-kube-api-access-zl65t\") pod \"auto-csr-approver-29535858-9fs8d\" (UID: \"3bb1e1bd-28ea-42f4-96d5-534db2674e68\") " 
pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.456893 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: E0227 00:18:00.483809 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535858-9fs8d_openshift-infra_3bb1e1bd-28ea-42f4-96d5-534db2674e68_0(a6422c13a22f0fd16b8f4c41ee16e0830dd21d1c8ffb88c80497e4e894942082): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:18:00 crc kubenswrapper[4781]: E0227 00:18:00.483896 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535858-9fs8d_openshift-infra_3bb1e1bd-28ea-42f4-96d5-534db2674e68_0(a6422c13a22f0fd16b8f4c41ee16e0830dd21d1c8ffb88c80497e4e894942082): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: E0227 00:18:00.483933 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535858-9fs8d_openshift-infra_3bb1e1bd-28ea-42f4-96d5-534db2674e68_0(a6422c13a22f0fd16b8f4c41ee16e0830dd21d1c8ffb88c80497e4e894942082): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: E0227 00:18:00.483976 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29535858-9fs8d_openshift-infra(3bb1e1bd-28ea-42f4-96d5-534db2674e68)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29535858-9fs8d_openshift-infra(3bb1e1bd-28ea-42f4-96d5-534db2674e68)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535858-9fs8d_openshift-infra_3bb1e1bd-28ea-42f4-96d5-534db2674e68_0(a6422c13a22f0fd16b8f4c41ee16e0830dd21d1c8ffb88c80497e4e894942082): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" podUID="3bb1e1bd-28ea-42f4-96d5-534db2674e68" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.709099 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.709508 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: E0227 00:18:00.742427 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535858-9fs8d_openshift-infra_3bb1e1bd-28ea-42f4-96d5-534db2674e68_0(d9421be3c1dae616d2afee4d7ac46bfab378f5bcb3f8439d332a0506ef9e5834): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:18:00 crc kubenswrapper[4781]: E0227 00:18:00.742484 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535858-9fs8d_openshift-infra_3bb1e1bd-28ea-42f4-96d5-534db2674e68_0(d9421be3c1dae616d2afee4d7ac46bfab378f5bcb3f8439d332a0506ef9e5834): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: E0227 00:18:00.742504 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535858-9fs8d_openshift-infra_3bb1e1bd-28ea-42f4-96d5-534db2674e68_0(d9421be3c1dae616d2afee4d7ac46bfab378f5bcb3f8439d332a0506ef9e5834): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: E0227 00:18:00.742539 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29535858-9fs8d_openshift-infra(3bb1e1bd-28ea-42f4-96d5-534db2674e68)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29535858-9fs8d_openshift-infra(3bb1e1bd-28ea-42f4-96d5-534db2674e68)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535858-9fs8d_openshift-infra_3bb1e1bd-28ea-42f4-96d5-534db2674e68_0(d9421be3c1dae616d2afee4d7ac46bfab378f5bcb3f8439d332a0506ef9e5834): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" podUID="3bb1e1bd-28ea-42f4-96d5-534db2674e68" Feb 27 00:18:03 crc kubenswrapper[4781]: I0227 00:18:03.308970 4781 scope.go:117] "RemoveContainer" containerID="a286864c68415e96f38bba630ac2325989837881e34a926c93977715f330a129" Feb 27 00:18:03 crc kubenswrapper[4781]: E0227 00:18:03.309427 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-tlstj_openshift-multus(9a6dd1e0-45ab-46f0-b298-d89e47aaeecb)\"" pod="openshift-multus/multus-tlstj" podUID="9a6dd1e0-45ab-46f0-b298-d89e47aaeecb" Feb 27 00:18:08 crc kubenswrapper[4781]: I0227 00:18:08.308434 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:18:08 crc kubenswrapper[4781]: I0227 00:18:08.309183 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:18:08 crc kubenswrapper[4781]: E0227 00:18:08.338728 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(f5e9e5e75916086dbc8cc15d24649fe4e7be5140046b48e992dd72871eb24e01): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:18:08 crc kubenswrapper[4781]: E0227 00:18:08.339375 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(f5e9e5e75916086dbc8cc15d24649fe4e7be5140046b48e992dd72871eb24e01): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:18:08 crc kubenswrapper[4781]: E0227 00:18:08.339414 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(f5e9e5e75916086dbc8cc15d24649fe4e7be5140046b48e992dd72871eb24e01): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:18:08 crc kubenswrapper[4781]: E0227 00:18:08.339463 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators(cbb658fa-808d-4c87-b81e-63863f31382f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators(cbb658fa-808d-4c87-b81e-63863f31382f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(f5e9e5e75916086dbc8cc15d24649fe4e7be5140046b48e992dd72871eb24e01): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" podUID="cbb658fa-808d-4c87-b81e-63863f31382f" Feb 27 00:18:09 crc kubenswrapper[4781]: I0227 00:18:09.308850 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:18:09 crc kubenswrapper[4781]: I0227 00:18:09.308893 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:18:09 crc kubenswrapper[4781]: I0227 00:18:09.309005 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:18:09 crc kubenswrapper[4781]: I0227 00:18:09.309436 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:18:09 crc kubenswrapper[4781]: I0227 00:18:09.309707 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:18:09 crc kubenswrapper[4781]: I0227 00:18:09.309783 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.353782 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(6559c0ff8f272afb500e316558ef1183a20ceb233ac20054686125ad4553fed9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.353899 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(6559c0ff8f272afb500e316558ef1183a20ceb233ac20054686125ad4553fed9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.353923 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(6559c0ff8f272afb500e316558ef1183a20ceb233ac20054686125ad4553fed9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.353985 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators(c62f5f48-b15f-4d70-837c-a05addc48839)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators(c62f5f48-b15f-4d70-837c-a05addc48839)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(6559c0ff8f272afb500e316558ef1183a20ceb233ac20054686125ad4553fed9): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" podUID="c62f5f48-b15f-4d70-837c-a05addc48839"
Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.358212 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(0d64a666439c90f739396fc0b07da83d309c4ba88ae567bac4aaa83b4bffcd61): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.358271 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(0d64a666439c90f739396fc0b07da83d309c4ba88ae567bac4aaa83b4bffcd61): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf"
Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.358291 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(0d64a666439c90f739396fc0b07da83d309c4ba88ae567bac4aaa83b4bffcd61): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf"
Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.358339 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-l5ppf_openshift-operators(1a3a6a15-797e-4cfe-8e21-3a813460012d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-l5ppf_openshift-operators(1a3a6a15-797e-4cfe-8e21-3a813460012d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(0d64a666439c90f739396fc0b07da83d309c4ba88ae567bac4aaa83b4bffcd61): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" podUID="1a3a6a15-797e-4cfe-8e21-3a813460012d"
Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.362888 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(feef5134aede6983b37f1d89f1bd220bcfd8c9a6654973fda44ec2b9bcf0cc4f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.363023 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(feef5134aede6983b37f1d89f1bd220bcfd8c9a6654973fda44ec2b9bcf0cc4f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7"
Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.363122 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(feef5134aede6983b37f1d89f1bd220bcfd8c9a6654973fda44ec2b9bcf0cc4f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7"
Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.363252 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators(5abff2aa-f9cb-469e-9a7e-7a6eea64d4db)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators(5abff2aa-f9cb-469e-9a7e-7a6eea64d4db)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(feef5134aede6983b37f1d89f1bd220bcfd8c9a6654973fda44ec2b9bcf0cc4f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" podUID="5abff2aa-f9cb-469e-9a7e-7a6eea64d4db"
Feb 27 00:18:10 crc kubenswrapper[4781]: I0227 00:18:10.309340 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs"
Feb 27 00:18:10 crc kubenswrapper[4781]: I0227 00:18:10.309837 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs"
Feb 27 00:18:10 crc kubenswrapper[4781]: E0227 00:18:10.334093 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(b3a07f20b55adde8b5c8d332535c3de5d766b2de7cdd6e78706dadbf116fc5dc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 27 00:18:10 crc kubenswrapper[4781]: E0227 00:18:10.334239 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(b3a07f20b55adde8b5c8d332535c3de5d766b2de7cdd6e78706dadbf116fc5dc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs"
Feb 27 00:18:10 crc kubenswrapper[4781]: E0227 00:18:10.334316 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(b3a07f20b55adde8b5c8d332535c3de5d766b2de7cdd6e78706dadbf116fc5dc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs"
Feb 27 00:18:10 crc kubenswrapper[4781]: E0227 00:18:10.334412 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-m6jxs_openshift-operators(3fe8e5f0-6c7b-42bd-9604-85a90477d143)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-m6jxs_openshift-operators(3fe8e5f0-6c7b-42bd-9604-85a90477d143)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(b3a07f20b55adde8b5c8d332535c3de5d766b2de7cdd6e78706dadbf116fc5dc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" podUID="3fe8e5f0-6c7b-42bd-9604-85a90477d143"
Feb 27 00:18:14 crc kubenswrapper[4781]: I0227 00:18:14.308823 4781 scope.go:117] "RemoveContainer" containerID="a286864c68415e96f38bba630ac2325989837881e34a926c93977715f330a129"
Feb 27 00:18:14 crc kubenswrapper[4781]: I0227 00:18:14.788967 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/2.log"
Feb 27 00:18:14 crc kubenswrapper[4781]: I0227 00:18:14.789309 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlstj" event={"ID":"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb","Type":"ContainerStarted","Data":"e8389900490984b4abb551a07e748ac9856ad8ae2f3078b664efd67dcaf5090c"}
Feb 27 00:18:16 crc kubenswrapper[4781]: I0227 00:18:16.309081 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535858-9fs8d"
Feb 27 00:18:16 crc kubenswrapper[4781]: I0227 00:18:16.309924 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535858-9fs8d"
Feb 27 00:18:16 crc kubenswrapper[4781]: I0227 00:18:16.543313 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535858-9fs8d"]
Feb 27 00:18:16 crc kubenswrapper[4781]: W0227 00:18:16.548222 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bb1e1bd_28ea_42f4_96d5_534db2674e68.slice/crio-5753f73223e139dd23552ed49014d8187711d2c8fe095b49b3d277fc5cfa3bb6 WatchSource:0}: Error finding container 5753f73223e139dd23552ed49014d8187711d2c8fe095b49b3d277fc5cfa3bb6: Status 404 returned error can't find the container with id 5753f73223e139dd23552ed49014d8187711d2c8fe095b49b3d277fc5cfa3bb6
Feb 27 00:18:16 crc kubenswrapper[4781]: I0227 00:18:16.799073 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" event={"ID":"3bb1e1bd-28ea-42f4-96d5-534db2674e68","Type":"ContainerStarted","Data":"5753f73223e139dd23552ed49014d8187711d2c8fe095b49b3d277fc5cfa3bb6"}
Feb 27 00:18:17 crc kubenswrapper[4781]: I0227 00:18:17.806295 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" event={"ID":"3bb1e1bd-28ea-42f4-96d5-534db2674e68","Type":"ContainerStarted","Data":"86bad95d795a7faf37cb19be6e8217786d2cabd57a047f7210f59250bf6bee2f"}
Feb 27 00:18:17 crc kubenswrapper[4781]: I0227 00:18:17.820100 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" podStartSLOduration=17.047292824 podStartE2EDuration="17.820079712s" podCreationTimestamp="2026-02-27 00:18:00 +0000 UTC" firstStartedPulling="2026-02-27 00:18:16.551528091 +0000 UTC m=+765.809067645" lastFinishedPulling="2026-02-27 00:18:17.324314939 +0000 UTC m=+766.581854533" observedRunningTime="2026-02-27 00:18:17.816605646 +0000 UTC m=+767.074145210" watchObservedRunningTime="2026-02-27 00:18:17.820079712 +0000 UTC m=+767.077619276"
Feb 27 00:18:18 crc kubenswrapper[4781]: I0227 00:18:18.815357 4781 generic.go:334] "Generic (PLEG): container finished" podID="3bb1e1bd-28ea-42f4-96d5-534db2674e68" containerID="86bad95d795a7faf37cb19be6e8217786d2cabd57a047f7210f59250bf6bee2f" exitCode=0
Feb 27 00:18:18 crc kubenswrapper[4781]: I0227 00:18:18.815416 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" event={"ID":"3bb1e1bd-28ea-42f4-96d5-534db2674e68","Type":"ContainerDied","Data":"86bad95d795a7faf37cb19be6e8217786d2cabd57a047f7210f59250bf6bee2f"}
Feb 27 00:18:20 crc kubenswrapper[4781]: I0227 00:18:20.309267 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m"
Feb 27 00:18:20 crc kubenswrapper[4781]: I0227 00:18:20.309881 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m"
Feb 27 00:18:20 crc kubenswrapper[4781]: I0227 00:18:20.582448 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4"
Feb 27 00:18:20 crc kubenswrapper[4781]: I0227 00:18:20.707460 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m"]
Feb 27 00:18:20 crc kubenswrapper[4781]: W0227 00:18:20.718835 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb658fa_808d_4c87_b81e_63863f31382f.slice/crio-7c438ce7c5544007d38c6cd9e1c6987673991fe6e99ffa2fb11d9d759b0abc72 WatchSource:0}: Error finding container 7c438ce7c5544007d38c6cd9e1c6987673991fe6e99ffa2fb11d9d759b0abc72: Status 404 returned error can't find the container with id 7c438ce7c5544007d38c6cd9e1c6987673991fe6e99ffa2fb11d9d759b0abc72
Feb 27 00:18:20 crc kubenswrapper[4781]: I0227 00:18:20.827476 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" event={"ID":"cbb658fa-808d-4c87-b81e-63863f31382f","Type":"ContainerStarted","Data":"7c438ce7c5544007d38c6cd9e1c6987673991fe6e99ffa2fb11d9d759b0abc72"}
Feb 27 00:18:20 crc kubenswrapper[4781]: I0227 00:18:20.967259 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535858-9fs8d"
Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.031413 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl65t\" (UniqueName: \"kubernetes.io/projected/3bb1e1bd-28ea-42f4-96d5-534db2674e68-kube-api-access-zl65t\") pod \"3bb1e1bd-28ea-42f4-96d5-534db2674e68\" (UID: \"3bb1e1bd-28ea-42f4-96d5-534db2674e68\") "
Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.038888 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bb1e1bd-28ea-42f4-96d5-534db2674e68-kube-api-access-zl65t" (OuterVolumeSpecName: "kube-api-access-zl65t") pod "3bb1e1bd-28ea-42f4-96d5-534db2674e68" (UID: "3bb1e1bd-28ea-42f4-96d5-534db2674e68"). InnerVolumeSpecName "kube-api-access-zl65t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.132429 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl65t\" (UniqueName: \"kubernetes.io/projected/3bb1e1bd-28ea-42f4-96d5-534db2674e68-kube-api-access-zl65t\") on node \"crc\" DevicePath \"\""
Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.309248 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7"
Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.321898 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7"
Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.737786 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7"]
Feb 27 00:18:21 crc kubenswrapper[4781]: W0227 00:18:21.744707 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5abff2aa_f9cb_469e_9a7e_7a6eea64d4db.slice/crio-793ec7f20ef3785eba07b672c006673b2739fb315e82e8c929f71449c39bd1b4 WatchSource:0}: Error finding container 793ec7f20ef3785eba07b672c006673b2739fb315e82e8c929f71449c39bd1b4: Status 404 returned error can't find the container with id 793ec7f20ef3785eba07b672c006673b2739fb315e82e8c929f71449c39bd1b4
Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.833517 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" event={"ID":"3bb1e1bd-28ea-42f4-96d5-534db2674e68","Type":"ContainerDied","Data":"5753f73223e139dd23552ed49014d8187711d2c8fe095b49b3d277fc5cfa3bb6"}
Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.833579 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5753f73223e139dd23552ed49014d8187711d2c8fe095b49b3d277fc5cfa3bb6"
Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.833985 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535858-9fs8d"
Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.835226 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" event={"ID":"5abff2aa-f9cb-469e-9a7e-7a6eea64d4db","Type":"ContainerStarted","Data":"793ec7f20ef3785eba07b672c006673b2739fb315e82e8c929f71449c39bd1b4"}
Feb 27 00:18:22 crc kubenswrapper[4781]: I0227 00:18:22.036746 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535852-49dfn"]
Feb 27 00:18:22 crc kubenswrapper[4781]: I0227 00:18:22.040530 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535852-49dfn"]
Feb 27 00:18:23 crc kubenswrapper[4781]: I0227 00:18:23.310853 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr"
Feb 27 00:18:23 crc kubenswrapper[4781]: I0227 00:18:23.311538 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr"
Feb 27 00:18:23 crc kubenswrapper[4781]: I0227 00:18:23.316500 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96ecbd6e-c579-40ca-a5bf-9876777721f9" path="/var/lib/kubelet/pods/96ecbd6e-c579-40ca-a5bf-9876777721f9/volumes"
Feb 27 00:18:23 crc kubenswrapper[4781]: I0227 00:18:23.698667 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr"]
Feb 27 00:18:24 crc kubenswrapper[4781]: I0227 00:18:24.309224 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf"
Feb 27 00:18:24 crc kubenswrapper[4781]: I0227 00:18:24.309749 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf"
Feb 27 00:18:25 crc kubenswrapper[4781]: W0227 00:18:25.092341 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc62f5f48_b15f_4d70_837c_a05addc48839.slice/crio-437727a4cb9884dae84df7c6049639908951feb4238c0dfe9b2c0ae180a02a8f WatchSource:0}: Error finding container 437727a4cb9884dae84df7c6049639908951feb4238c0dfe9b2c0ae180a02a8f: Status 404 returned error can't find the container with id 437727a4cb9884dae84df7c6049639908951feb4238c0dfe9b2c0ae180a02a8f
Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.286897 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-l5ppf"]
Feb 27 00:18:25 crc kubenswrapper[4781]: W0227 00:18:25.291114 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a3a6a15_797e_4cfe_8e21_3a813460012d.slice/crio-be4d989c5d40629c9ffc9d258e8cc02cceb9e3af824894dd87ad3b8e8676f558 WatchSource:0}: Error finding container be4d989c5d40629c9ffc9d258e8cc02cceb9e3af824894dd87ad3b8e8676f558: Status 404 returned error can't find the container with id be4d989c5d40629c9ffc9d258e8cc02cceb9e3af824894dd87ad3b8e8676f558
Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.309367 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs"
Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.309847 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs"
Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.763018 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-m6jxs"]
Feb 27 00:18:25 crc kubenswrapper[4781]: W0227 00:18:25.768914 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fe8e5f0_6c7b_42bd_9604_85a90477d143.slice/crio-955c9789b0bffc9aab82cb525b68f5f6f853e560a21e4930ac4fc52cb75812da WatchSource:0}: Error finding container 955c9789b0bffc9aab82cb525b68f5f6f853e560a21e4930ac4fc52cb75812da: Status 404 returned error can't find the container with id 955c9789b0bffc9aab82cb525b68f5f6f853e560a21e4930ac4fc52cb75812da
Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.878697 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" event={"ID":"cbb658fa-808d-4c87-b81e-63863f31382f","Type":"ContainerStarted","Data":"20271a79cce7f288f3687287e215ae8c9185ee9c5d926a46a261cc4110ea2da5"}
Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.880717 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" event={"ID":"3fe8e5f0-6c7b-42bd-9604-85a90477d143","Type":"ContainerStarted","Data":"955c9789b0bffc9aab82cb525b68f5f6f853e560a21e4930ac4fc52cb75812da"}
Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.882168 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" event={"ID":"5abff2aa-f9cb-469e-9a7e-7a6eea64d4db","Type":"ContainerStarted","Data":"a2fceaf7e740c7abd736297d999fdb938fda61424b439e9b244cba064b1fa240"}
Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.883548 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" event={"ID":"c62f5f48-b15f-4d70-837c-a05addc48839","Type":"ContainerStarted","Data":"437727a4cb9884dae84df7c6049639908951feb4238c0dfe9b2c0ae180a02a8f"}
Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.884842 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" event={"ID":"1a3a6a15-797e-4cfe-8e21-3a813460012d","Type":"ContainerStarted","Data":"be4d989c5d40629c9ffc9d258e8cc02cceb9e3af824894dd87ad3b8e8676f558"}
Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.915284 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" podStartSLOduration=28.441178941 podStartE2EDuration="32.915261722s" podCreationTimestamp="2026-02-27 00:17:53 +0000 UTC" firstStartedPulling="2026-02-27 00:18:20.721315406 +0000 UTC m=+769.978854960" lastFinishedPulling="2026-02-27 00:18:25.195398177 +0000 UTC m=+774.452937741" observedRunningTime="2026-02-27 00:18:25.90219526 +0000 UTC m=+775.159734824" watchObservedRunningTime="2026-02-27 00:18:25.915261722 +0000 UTC m=+775.172801286"
Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.927229 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" podStartSLOduration=29.505241674 podStartE2EDuration="32.927202516s" podCreationTimestamp="2026-02-27 00:17:53 +0000 UTC" firstStartedPulling="2026-02-27 00:18:21.751132696 +0000 UTC m=+771.008672250" lastFinishedPulling="2026-02-27 00:18:25.173093538 +0000 UTC m=+774.430633092" observedRunningTime="2026-02-27 00:18:25.924869969 +0000 UTC m=+775.182409543" watchObservedRunningTime="2026-02-27 00:18:25.927202516 +0000 UTC m=+775.184742110"
Feb 27 00:18:28 crc kubenswrapper[4781]: I0227 00:18:28.899844 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" event={"ID":"c62f5f48-b15f-4d70-837c-a05addc48839","Type":"ContainerStarted","Data":"5777fade9968bc3391c3e077c86044a6b6d19da1ff33ed02c9e469efeedf7926"}
Feb 27 00:18:28 crc kubenswrapper[4781]: I0227 00:18:28.904726 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" event={"ID":"1a3a6a15-797e-4cfe-8e21-3a813460012d","Type":"ContainerStarted","Data":"593e252543ed3cf9f71943b58ff742bcafa500a84e583914feb39957ef52fa7e"}
Feb 27 00:18:28 crc kubenswrapper[4781]: I0227 00:18:28.904893 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf"
Feb 27 00:18:28 crc kubenswrapper[4781]: I0227 00:18:28.927721 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" podStartSLOduration=34.066664131 podStartE2EDuration="36.927692805s" podCreationTimestamp="2026-02-27 00:17:52 +0000 UTC" firstStartedPulling="2026-02-27 00:18:25.098303035 +0000 UTC m=+774.355842629" lastFinishedPulling="2026-02-27 00:18:27.959331749 +0000 UTC m=+777.216871303" observedRunningTime="2026-02-27 00:18:28.920333283 +0000 UTC m=+778.177872837" watchObservedRunningTime="2026-02-27 00:18:28.927692805 +0000 UTC m=+778.185232389"
Feb 27 00:18:28 crc kubenswrapper[4781]: I0227 00:18:28.942114 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" podStartSLOduration=33.272588564 podStartE2EDuration="35.94209555s" podCreationTimestamp="2026-02-27 00:17:53 +0000 UTC" firstStartedPulling="2026-02-27 00:18:25.293312329 +0000 UTC m=+774.550851893" lastFinishedPulling="2026-02-27 00:18:27.962819325 +0000 UTC m=+777.220358879" observedRunningTime="2026-02-27 00:18:28.941935046 +0000 UTC m=+778.199474680" watchObservedRunningTime="2026-02-27 00:18:28.94209555 +0000 UTC m=+778.199635114"
Feb 27 00:18:30 crc kubenswrapper[4781]: I0227 00:18:30.931263 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" event={"ID":"3fe8e5f0-6c7b-42bd-9604-85a90477d143","Type":"ContainerStarted","Data":"3d7061a7f9a3a53dbea34bab246ba3a79a5c25b34f0e8fab1915465df308be59"}
Feb 27 00:18:30 crc kubenswrapper[4781]: I0227 00:18:30.931765 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs"
Feb 27 00:18:30 crc kubenswrapper[4781]: I0227 00:18:30.959481 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" podStartSLOduration=33.330624535 podStartE2EDuration="37.959455349s" podCreationTimestamp="2026-02-27 00:17:53 +0000 UTC" firstStartedPulling="2026-02-27 00:18:25.771578142 +0000 UTC m=+775.029117706" lastFinishedPulling="2026-02-27 00:18:30.400408956 +0000 UTC m=+779.657948520" observedRunningTime="2026-02-27 00:18:30.956933057 +0000 UTC m=+780.214472641" watchObservedRunningTime="2026-02-27 00:18:30.959455349 +0000 UTC m=+780.216994973"
Feb 27 00:18:30 crc kubenswrapper[4781]: I0227 00:18:30.982306 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs"
Feb 27 00:18:33 crc kubenswrapper[4781]: I0227 00:18:33.611225 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.345959 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-95z7d"]
Feb 27 00:18:37 crc kubenswrapper[4781]: E0227 00:18:37.346166 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb1e1bd-28ea-42f4-96d5-534db2674e68" containerName="oc"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.346178 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb1e1bd-28ea-42f4-96d5-534db2674e68" containerName="oc"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.346278 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb1e1bd-28ea-42f4-96d5-534db2674e68" containerName="oc"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.346645 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-95z7d"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.348539 4781 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-wgssp"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.348601 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.351771 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.361216 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-95z7d"]
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.370350 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-mwpvm"]
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.371019 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-mwpvm"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.372680 4781 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-bmw8p"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.388769 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-rwwkv"]
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.389506 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.390996 4781 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-b574w"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.398076 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-mwpvm"]
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.426460 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-rwwkv"]
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.476590 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz4dx\" (UniqueName: \"kubernetes.io/projected/af9e6ffa-5ea0-473d-9e75-a2715093490f-kube-api-access-bz4dx\") pod \"cert-manager-cainjector-cf98fcc89-95z7d\" (UID: \"af9e6ffa-5ea0-473d-9e75-a2715093490f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-95z7d"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.476770 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brf6d\" (UniqueName: \"kubernetes.io/projected/749ed3fc-65b7-4674-a1b1-0433692d2d89-kube-api-access-brf6d\") pod \"cert-manager-858654f9db-mwpvm\" (UID: \"749ed3fc-65b7-4674-a1b1-0433692d2d89\") " pod="cert-manager/cert-manager-858654f9db-mwpvm"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.476890 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwnzs\" (UniqueName: \"kubernetes.io/projected/b732ab89-7ea1-4378-9511-229ee7fa787f-kube-api-access-qwnzs\") pod \"cert-manager-webhook-687f57d79b-rwwkv\" (UID: \"b732ab89-7ea1-4378-9511-229ee7fa787f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.578193 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwnzs\" (UniqueName: \"kubernetes.io/projected/b732ab89-7ea1-4378-9511-229ee7fa787f-kube-api-access-qwnzs\") pod \"cert-manager-webhook-687f57d79b-rwwkv\" (UID: \"b732ab89-7ea1-4378-9511-229ee7fa787f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.578472 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz4dx\" (UniqueName: \"kubernetes.io/projected/af9e6ffa-5ea0-473d-9e75-a2715093490f-kube-api-access-bz4dx\") pod \"cert-manager-cainjector-cf98fcc89-95z7d\" (UID: \"af9e6ffa-5ea0-473d-9e75-a2715093490f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-95z7d"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.578523 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brf6d\" (UniqueName: \"kubernetes.io/projected/749ed3fc-65b7-4674-a1b1-0433692d2d89-kube-api-access-brf6d\") pod \"cert-manager-858654f9db-mwpvm\" (UID: \"749ed3fc-65b7-4674-a1b1-0433692d2d89\") " pod="cert-manager/cert-manager-858654f9db-mwpvm"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.603362 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz4dx\" (UniqueName: \"kubernetes.io/projected/af9e6ffa-5ea0-473d-9e75-a2715093490f-kube-api-access-bz4dx\") pod \"cert-manager-cainjector-cf98fcc89-95z7d\" (UID: \"af9e6ffa-5ea0-473d-9e75-a2715093490f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-95z7d"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.604262 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwnzs\" (UniqueName: \"kubernetes.io/projected/b732ab89-7ea1-4378-9511-229ee7fa787f-kube-api-access-qwnzs\") pod \"cert-manager-webhook-687f57d79b-rwwkv\" (UID: \"b732ab89-7ea1-4378-9511-229ee7fa787f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.604893 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brf6d\" (UniqueName: \"kubernetes.io/projected/749ed3fc-65b7-4674-a1b1-0433692d2d89-kube-api-access-brf6d\") pod \"cert-manager-858654f9db-mwpvm\" (UID: \"749ed3fc-65b7-4674-a1b1-0433692d2d89\") " pod="cert-manager/cert-manager-858654f9db-mwpvm"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.658879 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-95z7d"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.724721 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-mwpvm"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.736104 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv"
Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.936746 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-95z7d"]
Feb 27 00:18:37 crc kubenswrapper[4781]: W0227 00:18:37.986440 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf9e6ffa_5ea0_473d_9e75_a2715093490f.slice/crio-e6e7329a2c7276542a8c6bd7df5c4d934edd7576f427102979fb231ad7e7bc04 WatchSource:0}: Error finding container e6e7329a2c7276542a8c6bd7df5c4d934edd7576f427102979fb231ad7e7bc04: Status 404 returned error can't find the container with id e6e7329a2c7276542a8c6bd7df5c4d934edd7576f427102979fb231ad7e7bc04
Feb 27 00:18:38 crc kubenswrapper[4781]: I0227 00:18:38.146368 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-mwpvm"]
Feb 27 00:18:38 crc kubenswrapper[4781]: I0227 00:18:38.274957 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-rwwkv"]
Feb 27 00:18:38 crc kubenswrapper[4781]: I0227 00:18:38.979050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-95z7d" event={"ID":"af9e6ffa-5ea0-473d-9e75-a2715093490f","Type":"ContainerStarted","Data":"e6e7329a2c7276542a8c6bd7df5c4d934edd7576f427102979fb231ad7e7bc04"}
Feb 27 00:18:38 crc kubenswrapper[4781]: I0227 00:18:38.980085 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-mwpvm" event={"ID":"749ed3fc-65b7-4674-a1b1-0433692d2d89","Type":"ContainerStarted","Data":"15542c648efe93c26b17bc9bbb1df17d6db9b229ceedac117dc458cdd98987e1"}
Feb 27 00:18:38 crc kubenswrapper[4781]: I0227 00:18:38.980729 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv" event={"ID":"b732ab89-7ea1-4378-9511-229ee7fa787f","Type":"ContainerStarted","Data":"c9abdf86c1a1108c4f4b029cf3c330a6dd7c3586bd21f0a2c9e9c7bd4dc31d29"}
Feb 27 00:18:44 crc kubenswrapper[4781]: I0227 00:18:44.009144 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-95z7d" event={"ID":"af9e6ffa-5ea0-473d-9e75-a2715093490f","Type":"ContainerStarted","Data":"190fb6ed70375b908ee29b76f2325b6774ab4fbf5600c1747aa43895ad26e05a"}
Feb 27 00:18:44 crc kubenswrapper[4781]: I0227 00:18:44.010695 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-mwpvm" event={"ID":"749ed3fc-65b7-4674-a1b1-0433692d2d89","Type":"ContainerStarted","Data":"1fadddfe605cb826ecea402b69310323c2f9bbbd528bd35706eae7b2d853bbda"}
Feb 27 00:18:44 crc kubenswrapper[4781]: I0227 00:18:44.012495 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv" event={"ID":"b732ab89-7ea1-4378-9511-229ee7fa787f","Type":"ContainerStarted","Data":"5bc2cd1877bd267dc02009ca544c492d56d95e8421237a70276854b347449883"}
Feb 27 00:18:44 crc kubenswrapper[4781]: I0227 00:18:44.012739 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv"
Feb 27 00:18:44 crc kubenswrapper[4781]: I0227 00:18:44.027111 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-95z7d" podStartSLOduration=2.244173228 podStartE2EDuration="7.027087866s" podCreationTimestamp="2026-02-27 00:18:37 +0000 UTC" firstStartedPulling="2026-02-27 00:18:37.992929083 +0000 UTC m=+787.250468637" lastFinishedPulling="2026-02-27 00:18:42.775843681 +0000 UTC m=+792.033383275" observedRunningTime="2026-02-27 00:18:44.024931963 +0000 UTC m=+793.282471517" watchObservedRunningTime="2026-02-27 00:18:44.027087866 +0000 UTC m=+793.284627420"
Feb 27 00:18:44 crc kubenswrapper[4781]: I0227 
00:18:44.044422 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv" podStartSLOduration=2.468197448 podStartE2EDuration="7.044404823s" podCreationTimestamp="2026-02-27 00:18:37 +0000 UTC" firstStartedPulling="2026-02-27 00:18:38.280350994 +0000 UTC m=+787.537890548" lastFinishedPulling="2026-02-27 00:18:42.856558349 +0000 UTC m=+792.114097923" observedRunningTime="2026-02-27 00:18:44.043367447 +0000 UTC m=+793.300907021" watchObservedRunningTime="2026-02-27 00:18:44.044404823 +0000 UTC m=+793.301944377" Feb 27 00:18:44 crc kubenswrapper[4781]: I0227 00:18:44.073401 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-mwpvm" podStartSLOduration=2.452774427 podStartE2EDuration="7.073376516s" podCreationTimestamp="2026-02-27 00:18:37 +0000 UTC" firstStartedPulling="2026-02-27 00:18:38.157084457 +0000 UTC m=+787.414624011" lastFinishedPulling="2026-02-27 00:18:42.777686546 +0000 UTC m=+792.035226100" observedRunningTime="2026-02-27 00:18:44.066287552 +0000 UTC m=+793.323827116" watchObservedRunningTime="2026-02-27 00:18:44.073376516 +0000 UTC m=+793.330916080" Feb 27 00:18:48 crc kubenswrapper[4781]: I0227 00:18:48.869504 4781 scope.go:117] "RemoveContainer" containerID="5a1ffc2079241a21de7cc919695abf3baba7e2af15f91ad7d2c4786574ddb8a4" Feb 27 00:18:52 crc kubenswrapper[4781]: I0227 00:18:52.739139 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv" Feb 27 00:19:16 crc kubenswrapper[4781]: I0227 00:19:16.555908 4781 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.785579 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft"] Feb 27 
00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.786813 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.791601 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.800965 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft"] Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.829801 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.829878 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.829934 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt87t\" (UniqueName: \"kubernetes.io/projected/b41e2a48-4103-4cf3-be92-92180cbb2510-kube-api-access-gt87t\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " 
pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.931299 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.931412 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.931510 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt87t\" (UniqueName: \"kubernetes.io/projected/b41e2a48-4103-4cf3-be92-92180cbb2510-kube-api-access-gt87t\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.931956 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.932208 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.951828 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt87t\" (UniqueName: \"kubernetes.io/projected/b41e2a48-4103-4cf3-be92-92180cbb2510-kube-api-access-gt87t\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:21 crc kubenswrapper[4781]: I0227 00:19:21.101657 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:21 crc kubenswrapper[4781]: I0227 00:19:21.609874 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft"] Feb 27 00:19:22 crc kubenswrapper[4781]: I0227 00:19:22.301562 4781 generic.go:334] "Generic (PLEG): container finished" podID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerID="bc7287992fd979e253afb46a18f7f269a613b396ec959609a258dd64fecc5b22" exitCode=0 Feb 27 00:19:22 crc kubenswrapper[4781]: I0227 00:19:22.301657 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" event={"ID":"b41e2a48-4103-4cf3-be92-92180cbb2510","Type":"ContainerDied","Data":"bc7287992fd979e253afb46a18f7f269a613b396ec959609a258dd64fecc5b22"} Feb 27 00:19:22 crc kubenswrapper[4781]: I0227 00:19:22.301728 4781 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" event={"ID":"b41e2a48-4103-4cf3-be92-92180cbb2510","Type":"ContainerStarted","Data":"7d2eca92294c9202715d81184c0e75a00c33a2fd2d34bb6cf07794bd3af6de5d"} Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.161702 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cxq4v"] Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.165962 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.205344 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.206456 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.212371 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.212899 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.218711 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxq4v"] Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.229524 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.289773 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-utilities\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc 
kubenswrapper[4781]: I0227 00:19:23.289814 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-catalog-content\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.289887 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2742\" (UniqueName: \"kubernetes.io/projected/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-kube-api-access-v2742\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.390934 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-utilities\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.390990 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d7f58c6f-f8ba-472f-9b8e-22ada42e91f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7f58c6f-f8ba-472f-9b8e-22ada42e91f0\") pod \"minio\" (UID: \"a0aec676-41f4-4855-a823-2a3b21cbe197\") " pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.391011 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-catalog-content\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 
00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.391057 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2742\" (UniqueName: \"kubernetes.io/projected/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-kube-api-access-v2742\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.391101 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtv88\" (UniqueName: \"kubernetes.io/projected/a0aec676-41f4-4855-a823-2a3b21cbe197-kube-api-access-wtv88\") pod \"minio\" (UID: \"a0aec676-41f4-4855-a823-2a3b21cbe197\") " pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.391757 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-utilities\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.391763 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-catalog-content\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.419204 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2742\" (UniqueName: \"kubernetes.io/projected/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-kube-api-access-v2742\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.492337 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtv88\" (UniqueName: \"kubernetes.io/projected/a0aec676-41f4-4855-a823-2a3b21cbe197-kube-api-access-wtv88\") pod \"minio\" (UID: \"a0aec676-41f4-4855-a823-2a3b21cbe197\") " pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.492480 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d7f58c6f-f8ba-472f-9b8e-22ada42e91f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7f58c6f-f8ba-472f-9b8e-22ada42e91f0\") pod \"minio\" (UID: \"a0aec676-41f4-4855-a823-2a3b21cbe197\") " pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.498368 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.498405 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d7f58c6f-f8ba-472f-9b8e-22ada42e91f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7f58c6f-f8ba-472f-9b8e-22ada42e91f0\") pod \"minio\" (UID: \"a0aec676-41f4-4855-a823-2a3b21cbe197\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/68e97b9e042bbbb942f612a5d441bf2a1f903b9844cb564189e4809c6edaf34d/globalmount\"" pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.511346 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtv88\" (UniqueName: \"kubernetes.io/projected/a0aec676-41f4-4855-a823-2a3b21cbe197-kube-api-access-wtv88\") pod \"minio\" (UID: \"a0aec676-41f4-4855-a823-2a3b21cbe197\") " pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.518999 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d7f58c6f-f8ba-472f-9b8e-22ada42e91f0\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7f58c6f-f8ba-472f-9b8e-22ada42e91f0\") pod \"minio\" (UID: \"a0aec676-41f4-4855-a823-2a3b21cbe197\") " pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.529973 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.536379 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.824004 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 27 00:19:23 crc kubenswrapper[4781]: W0227 00:19:23.827731 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0aec676_41f4_4855_a823_2a3b21cbe197.slice/crio-a1050fb5b012640328c623fbff0bbdc041549c7d210a82dfd0c46ed2ad129dc6 WatchSource:0}: Error finding container a1050fb5b012640328c623fbff0bbdc041549c7d210a82dfd0c46ed2ad129dc6: Status 404 returned error can't find the container with id a1050fb5b012640328c623fbff0bbdc041549c7d210a82dfd0c46ed2ad129dc6 Feb 27 00:19:24 crc kubenswrapper[4781]: I0227 00:19:24.041153 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxq4v"] Feb 27 00:19:24 crc kubenswrapper[4781]: I0227 00:19:24.334291 4781 generic.go:334] "Generic (PLEG): container finished" podID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerID="7aaedda8a4b57bd598af91df724a3a264b47d5539f95c9ab714487d376f1cf73" exitCode=0 Feb 27 00:19:24 crc kubenswrapper[4781]: I0227 00:19:24.334360 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" 
event={"ID":"b41e2a48-4103-4cf3-be92-92180cbb2510","Type":"ContainerDied","Data":"7aaedda8a4b57bd598af91df724a3a264b47d5539f95c9ab714487d376f1cf73"} Feb 27 00:19:24 crc kubenswrapper[4781]: I0227 00:19:24.341165 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"a0aec676-41f4-4855-a823-2a3b21cbe197","Type":"ContainerStarted","Data":"a1050fb5b012640328c623fbff0bbdc041549c7d210a82dfd0c46ed2ad129dc6"} Feb 27 00:19:24 crc kubenswrapper[4781]: I0227 00:19:24.343421 4781 generic.go:334] "Generic (PLEG): container finished" podID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerID="c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a" exitCode=0 Feb 27 00:19:24 crc kubenswrapper[4781]: I0227 00:19:24.343446 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxq4v" event={"ID":"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6","Type":"ContainerDied","Data":"c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a"} Feb 27 00:19:24 crc kubenswrapper[4781]: I0227 00:19:24.343466 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxq4v" event={"ID":"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6","Type":"ContainerStarted","Data":"b3755082d116f31051d75af049033b929644572e35832ae06d2d8defd5e385bf"} Feb 27 00:19:25 crc kubenswrapper[4781]: I0227 00:19:25.352412 4781 generic.go:334] "Generic (PLEG): container finished" podID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerID="9dea4a634c964f942cc980c671d7554dfbcca7841dbc0cac25ce15174d9577d9" exitCode=0 Feb 27 00:19:25 crc kubenswrapper[4781]: I0227 00:19:25.352473 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" event={"ID":"b41e2a48-4103-4cf3-be92-92180cbb2510","Type":"ContainerDied","Data":"9dea4a634c964f942cc980c671d7554dfbcca7841dbc0cac25ce15174d9577d9"} Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 
00:19:26.652366 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.741508 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt87t\" (UniqueName: \"kubernetes.io/projected/b41e2a48-4103-4cf3-be92-92180cbb2510-kube-api-access-gt87t\") pod \"b41e2a48-4103-4cf3-be92-92180cbb2510\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.741922 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-util\") pod \"b41e2a48-4103-4cf3-be92-92180cbb2510\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.742113 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-bundle\") pod \"b41e2a48-4103-4cf3-be92-92180cbb2510\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.743777 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-bundle" (OuterVolumeSpecName: "bundle") pod "b41e2a48-4103-4cf3-be92-92180cbb2510" (UID: "b41e2a48-4103-4cf3-be92-92180cbb2510"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.747781 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b41e2a48-4103-4cf3-be92-92180cbb2510-kube-api-access-gt87t" (OuterVolumeSpecName: "kube-api-access-gt87t") pod "b41e2a48-4103-4cf3-be92-92180cbb2510" (UID: "b41e2a48-4103-4cf3-be92-92180cbb2510"). InnerVolumeSpecName "kube-api-access-gt87t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.756422 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-util" (OuterVolumeSpecName: "util") pod "b41e2a48-4103-4cf3-be92-92180cbb2510" (UID: "b41e2a48-4103-4cf3-be92-92180cbb2510"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.843397 4781 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.843432 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt87t\" (UniqueName: \"kubernetes.io/projected/b41e2a48-4103-4cf3-be92-92180cbb2510-kube-api-access-gt87t\") on node \"crc\" DevicePath \"\"" Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.843444 4781 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-util\") on node \"crc\" DevicePath \"\"" Feb 27 00:19:27 crc kubenswrapper[4781]: I0227 00:19:27.364948 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" 
event={"ID":"a0aec676-41f4-4855-a823-2a3b21cbe197","Type":"ContainerStarted","Data":"21bbec858d72f4a89b6f19914c34e02ea4d0991ee8b28d1b141c8826b9f5bd89"} Feb 27 00:19:27 crc kubenswrapper[4781]: I0227 00:19:27.367216 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxq4v" event={"ID":"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6","Type":"ContainerStarted","Data":"fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28"} Feb 27 00:19:27 crc kubenswrapper[4781]: I0227 00:19:27.369252 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" event={"ID":"b41e2a48-4103-4cf3-be92-92180cbb2510","Type":"ContainerDied","Data":"7d2eca92294c9202715d81184c0e75a00c33a2fd2d34bb6cf07794bd3af6de5d"} Feb 27 00:19:27 crc kubenswrapper[4781]: I0227 00:19:27.369280 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d2eca92294c9202715d81184c0e75a00c33a2fd2d34bb6cf07794bd3af6de5d" Feb 27 00:19:27 crc kubenswrapper[4781]: I0227 00:19:27.369320 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:27 crc kubenswrapper[4781]: I0227 00:19:27.381917 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.277095506 podStartE2EDuration="7.381896885s" podCreationTimestamp="2026-02-27 00:19:20 +0000 UTC" firstStartedPulling="2026-02-27 00:19:23.829871438 +0000 UTC m=+833.087410992" lastFinishedPulling="2026-02-27 00:19:26.934672817 +0000 UTC m=+836.192212371" observedRunningTime="2026-02-27 00:19:27.377127127 +0000 UTC m=+836.634666691" watchObservedRunningTime="2026-02-27 00:19:27.381896885 +0000 UTC m=+836.639436449" Feb 27 00:19:28 crc kubenswrapper[4781]: I0227 00:19:28.379749 4781 generic.go:334] "Generic (PLEG): container finished" podID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerID="fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28" exitCode=0 Feb 27 00:19:28 crc kubenswrapper[4781]: I0227 00:19:28.379819 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxq4v" event={"ID":"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6","Type":"ContainerDied","Data":"fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28"} Feb 27 00:19:29 crc kubenswrapper[4781]: I0227 00:19:29.387135 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxq4v" event={"ID":"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6","Type":"ContainerStarted","Data":"3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae"} Feb 27 00:19:29 crc kubenswrapper[4781]: I0227 00:19:29.414645 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cxq4v" podStartSLOduration=2.007851489 podStartE2EDuration="6.414603141s" podCreationTimestamp="2026-02-27 00:19:23 +0000 UTC" firstStartedPulling="2026-02-27 00:19:24.349571602 +0000 UTC m=+833.607111156" 
lastFinishedPulling="2026-02-27 00:19:28.756323254 +0000 UTC m=+838.013862808" observedRunningTime="2026-02-27 00:19:29.412448728 +0000 UTC m=+838.669988272" watchObservedRunningTime="2026-02-27 00:19:29.414603141 +0000 UTC m=+838.672142695" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.531737 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.532278 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.579072 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj"] Feb 27 00:19:33 crc kubenswrapper[4781]: E0227 00:19:33.579290 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerName="pull" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.579306 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerName="pull" Feb 27 00:19:33 crc kubenswrapper[4781]: E0227 00:19:33.579319 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerName="extract" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.579325 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerName="extract" Feb 27 00:19:33 crc kubenswrapper[4781]: E0227 00:19:33.579333 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerName="util" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.579340 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerName="util" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 
00:19:33.579449 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerName="extract" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.579988 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.582592 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.582695 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.583271 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-7krlt" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.583284 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.583549 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.587424 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.598061 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj"] Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.739178 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcjsq\" (UniqueName: 
\"kubernetes.io/projected/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-kube-api-access-pcjsq\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.739254 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-webhook-cert\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.739302 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.739346 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-manager-config\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.739467 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-apiservice-cert\") pod 
\"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.840708 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-apiservice-cert\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.840830 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcjsq\" (UniqueName: \"kubernetes.io/projected/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-kube-api-access-pcjsq\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.840864 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-webhook-cert\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.840893 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.840940 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-manager-config\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.842381 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-manager-config\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.849850 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-apiservice-cert\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.850870 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.861342 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-pcjsq\" (UniqueName: \"kubernetes.io/projected/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-kube-api-access-pcjsq\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.863747 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-webhook-cert\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.892534 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:34 crc kubenswrapper[4781]: I0227 00:19:34.136852 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj"] Feb 27 00:19:34 crc kubenswrapper[4781]: W0227 00:19:34.138763 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fed4c33_9f3f_486b_8f74_f2d9a09b92be.slice/crio-91e075b25116dfc1152f7bbf22538b3b541d447572a496588f2a3174019e9772 WatchSource:0}: Error finding container 91e075b25116dfc1152f7bbf22538b3b541d447572a496588f2a3174019e9772: Status 404 returned error can't find the container with id 91e075b25116dfc1152f7bbf22538b3b541d447572a496588f2a3174019e9772 Feb 27 00:19:34 crc kubenswrapper[4781]: I0227 00:19:34.412689 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" 
event={"ID":"1fed4c33-9f3f-486b-8f74-f2d9a09b92be","Type":"ContainerStarted","Data":"91e075b25116dfc1152f7bbf22538b3b541d447572a496588f2a3174019e9772"} Feb 27 00:19:34 crc kubenswrapper[4781]: I0227 00:19:34.614257 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cxq4v" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="registry-server" probeResult="failure" output=< Feb 27 00:19:34 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 00:19:34 crc kubenswrapper[4781]: > Feb 27 00:19:39 crc kubenswrapper[4781]: I0227 00:19:39.440867 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" event={"ID":"1fed4c33-9f3f-486b-8f74-f2d9a09b92be","Type":"ContainerStarted","Data":"24689dd834be85cbfe91a39df3ce366aa3d20f4672567cd3f86ec05888c16cd8"} Feb 27 00:19:42 crc kubenswrapper[4781]: I0227 00:19:42.895903 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:19:42 crc kubenswrapper[4781]: I0227 00:19:42.896267 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:19:43 crc kubenswrapper[4781]: I0227 00:19:43.575084 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:43 crc kubenswrapper[4781]: I0227 00:19:43.620637 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:44 crc kubenswrapper[4781]: I0227 00:19:44.760562 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cxq4v"] Feb 27 00:19:45 crc kubenswrapper[4781]: I0227 00:19:45.479977 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cxq4v" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="registry-server" containerID="cri-o://3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae" gracePeriod=2 Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.077714 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.147059 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-utilities\") pod \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.147123 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2742\" (UniqueName: \"kubernetes.io/projected/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-kube-api-access-v2742\") pod \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.147149 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-catalog-content\") pod \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.148877 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-utilities" (OuterVolumeSpecName: "utilities") pod "9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" (UID: "9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.153046 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-kube-api-access-v2742" (OuterVolumeSpecName: "kube-api-access-v2742") pod "9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" (UID: "9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6"). InnerVolumeSpecName "kube-api-access-v2742". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.248958 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.249015 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2742\" (UniqueName: \"kubernetes.io/projected/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-kube-api-access-v2742\") on node \"crc\" DevicePath \"\"" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.251143 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" (UID: "9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.350374 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.489008 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" event={"ID":"1fed4c33-9f3f-486b-8f74-f2d9a09b92be","Type":"ContainerStarted","Data":"3def8960b32d0635eb7789a6c586b8b43152158956b19377792b52cb099ee428"} Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.489434 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.491528 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.491866 4781 generic.go:334] "Generic (PLEG): container finished" podID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerID="3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae" exitCode=0 Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.491936 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxq4v" event={"ID":"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6","Type":"ContainerDied","Data":"3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae"} Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.491980 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxq4v" 
event={"ID":"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6","Type":"ContainerDied","Data":"b3755082d116f31051d75af049033b929644572e35832ae06d2d8defd5e385bf"} Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.492011 4781 scope.go:117] "RemoveContainer" containerID="3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.491940 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.522360 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" podStartSLOduration=1.749427658 podStartE2EDuration="13.522341528s" podCreationTimestamp="2026-02-27 00:19:33 +0000 UTC" firstStartedPulling="2026-02-27 00:19:34.142643429 +0000 UTC m=+843.400182993" lastFinishedPulling="2026-02-27 00:19:45.915557309 +0000 UTC m=+855.173096863" observedRunningTime="2026-02-27 00:19:46.515037262 +0000 UTC m=+855.772576856" watchObservedRunningTime="2026-02-27 00:19:46.522341528 +0000 UTC m=+855.779881092" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.527491 4781 scope.go:117] "RemoveContainer" containerID="fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.541596 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cxq4v"] Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.549086 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cxq4v"] Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.563929 4781 scope.go:117] "RemoveContainer" containerID="c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.585023 4781 scope.go:117] "RemoveContainer" 
containerID="3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae" Feb 27 00:19:46 crc kubenswrapper[4781]: E0227 00:19:46.586311 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae\": container with ID starting with 3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae not found: ID does not exist" containerID="3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.586355 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae"} err="failed to get container status \"3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae\": rpc error: code = NotFound desc = could not find container \"3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae\": container with ID starting with 3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae not found: ID does not exist" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.586383 4781 scope.go:117] "RemoveContainer" containerID="fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28" Feb 27 00:19:46 crc kubenswrapper[4781]: E0227 00:19:46.586934 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28\": container with ID starting with fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28 not found: ID does not exist" containerID="fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.586993 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28"} err="failed to get container status \"fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28\": rpc error: code = NotFound desc = could not find container \"fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28\": container with ID starting with fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28 not found: ID does not exist" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.587029 4781 scope.go:117] "RemoveContainer" containerID="c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a" Feb 27 00:19:46 crc kubenswrapper[4781]: E0227 00:19:46.587479 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a\": container with ID starting with c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a not found: ID does not exist" containerID="c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.587525 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a"} err="failed to get container status \"c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a\": rpc error: code = NotFound desc = could not find container \"c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a\": container with ID starting with c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a not found: ID does not exist" Feb 27 00:19:47 crc kubenswrapper[4781]: I0227 00:19:47.317710 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" path="/var/lib/kubelet/pods/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6/volumes" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 
00:20:00.139929 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535860-d6xsb"] Feb 27 00:20:00 crc kubenswrapper[4781]: E0227 00:20:00.140617 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="registry-server" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.140642 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="registry-server" Feb 27 00:20:00 crc kubenswrapper[4781]: E0227 00:20:00.140653 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="extract-utilities" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.140659 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="extract-utilities" Feb 27 00:20:00 crc kubenswrapper[4781]: E0227 00:20:00.140671 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="extract-content" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.140677 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="extract-content" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.140781 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="registry-server" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.141130 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535860-d6xsb" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.143669 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.145051 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.145833 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.157792 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535860-d6xsb"] Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.225883 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76h28\" (UniqueName: \"kubernetes.io/projected/c8ba504f-040f-4632-b5d0-4b28aef8d27e-kube-api-access-76h28\") pod \"auto-csr-approver-29535860-d6xsb\" (UID: \"c8ba504f-040f-4632-b5d0-4b28aef8d27e\") " pod="openshift-infra/auto-csr-approver-29535860-d6xsb" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.327007 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76h28\" (UniqueName: \"kubernetes.io/projected/c8ba504f-040f-4632-b5d0-4b28aef8d27e-kube-api-access-76h28\") pod \"auto-csr-approver-29535860-d6xsb\" (UID: \"c8ba504f-040f-4632-b5d0-4b28aef8d27e\") " pod="openshift-infra/auto-csr-approver-29535860-d6xsb" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.350237 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76h28\" (UniqueName: \"kubernetes.io/projected/c8ba504f-040f-4632-b5d0-4b28aef8d27e-kube-api-access-76h28\") pod \"auto-csr-approver-29535860-d6xsb\" (UID: \"c8ba504f-040f-4632-b5d0-4b28aef8d27e\") " 
pod="openshift-infra/auto-csr-approver-29535860-d6xsb" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.458397 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535860-d6xsb" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.912212 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535860-d6xsb"] Feb 27 00:20:01 crc kubenswrapper[4781]: I0227 00:20:01.598034 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535860-d6xsb" event={"ID":"c8ba504f-040f-4632-b5d0-4b28aef8d27e","Type":"ContainerStarted","Data":"ebbaa7e5cfe4109f4715244e441a89cf7a8b9e24ee4b9e5fe796cdd5c1b58c0b"} Feb 27 00:20:02 crc kubenswrapper[4781]: I0227 00:20:02.606461 4781 generic.go:334] "Generic (PLEG): container finished" podID="c8ba504f-040f-4632-b5d0-4b28aef8d27e" containerID="3eb6fa2c40c5ff8bd90c7472dc3a2b552bb7c38236a559c08d25c903e216a06b" exitCode=0 Feb 27 00:20:02 crc kubenswrapper[4781]: I0227 00:20:02.606579 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535860-d6xsb" event={"ID":"c8ba504f-040f-4632-b5d0-4b28aef8d27e","Type":"ContainerDied","Data":"3eb6fa2c40c5ff8bd90c7472dc3a2b552bb7c38236a559c08d25c903e216a06b"} Feb 27 00:20:03 crc kubenswrapper[4781]: I0227 00:20:03.913414 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535860-d6xsb" Feb 27 00:20:03 crc kubenswrapper[4781]: I0227 00:20:03.972980 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76h28\" (UniqueName: \"kubernetes.io/projected/c8ba504f-040f-4632-b5d0-4b28aef8d27e-kube-api-access-76h28\") pod \"c8ba504f-040f-4632-b5d0-4b28aef8d27e\" (UID: \"c8ba504f-040f-4632-b5d0-4b28aef8d27e\") " Feb 27 00:20:03 crc kubenswrapper[4781]: I0227 00:20:03.980265 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8ba504f-040f-4632-b5d0-4b28aef8d27e-kube-api-access-76h28" (OuterVolumeSpecName: "kube-api-access-76h28") pod "c8ba504f-040f-4632-b5d0-4b28aef8d27e" (UID: "c8ba504f-040f-4632-b5d0-4b28aef8d27e"). InnerVolumeSpecName "kube-api-access-76h28". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:20:04 crc kubenswrapper[4781]: I0227 00:20:04.075105 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76h28\" (UniqueName: \"kubernetes.io/projected/c8ba504f-040f-4632-b5d0-4b28aef8d27e-kube-api-access-76h28\") on node \"crc\" DevicePath \"\"" Feb 27 00:20:04 crc kubenswrapper[4781]: I0227 00:20:04.622557 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535860-d6xsb" event={"ID":"c8ba504f-040f-4632-b5d0-4b28aef8d27e","Type":"ContainerDied","Data":"ebbaa7e5cfe4109f4715244e441a89cf7a8b9e24ee4b9e5fe796cdd5c1b58c0b"} Feb 27 00:20:04 crc kubenswrapper[4781]: I0227 00:20:04.622621 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebbaa7e5cfe4109f4715244e441a89cf7a8b9e24ee4b9e5fe796cdd5c1b58c0b" Feb 27 00:20:04 crc kubenswrapper[4781]: I0227 00:20:04.622733 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535860-d6xsb" Feb 27 00:20:04 crc kubenswrapper[4781]: I0227 00:20:04.974618 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535854-lplm8"] Feb 27 00:20:04 crc kubenswrapper[4781]: I0227 00:20:04.983960 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535854-lplm8"] Feb 27 00:20:05 crc kubenswrapper[4781]: I0227 00:20:05.321685 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2676f22-56e0-46ed-83d0-4d29fc704155" path="/var/lib/kubelet/pods/d2676f22-56e0-46ed-83d0-4d29fc704155/volumes" Feb 27 00:20:12 crc kubenswrapper[4781]: I0227 00:20:12.896093 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:20:12 crc kubenswrapper[4781]: I0227 00:20:12.897010 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.134935 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676"] Feb 27 00:20:18 crc kubenswrapper[4781]: E0227 00:20:18.135676 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ba504f-040f-4632-b5d0-4b28aef8d27e" containerName="oc" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.135690 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ba504f-040f-4632-b5d0-4b28aef8d27e" 
containerName="oc" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.135817 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ba504f-040f-4632-b5d0-4b28aef8d27e" containerName="oc" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.136515 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.138516 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.144822 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676"] Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.264793 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxr5m\" (UniqueName: \"kubernetes.io/projected/f26f6c49-1028-49bf-9259-4c08b835cfbb-kube-api-access-dxr5m\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.265103 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.265172 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.366681 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.366725 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.366762 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxr5m\" (UniqueName: \"kubernetes.io/projected/f26f6c49-1028-49bf-9259-4c08b835cfbb-kube-api-access-dxr5m\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.367204 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: 
\"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.367204 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.388032 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxr5m\" (UniqueName: \"kubernetes.io/projected/f26f6c49-1028-49bf-9259-4c08b835cfbb-kube-api-access-dxr5m\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.455738 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.668931 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676"] Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.735779 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" event={"ID":"f26f6c49-1028-49bf-9259-4c08b835cfbb","Type":"ContainerStarted","Data":"357800bcda8de6f9e140d7c6d00b143ef325514626de6c9a5edf60efe4d0eda2"} Feb 27 00:20:19 crc kubenswrapper[4781]: I0227 00:20:19.744828 4781 generic.go:334] "Generic (PLEG): container finished" podID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerID="a27f23d6b7b5be18e741a4b8807d607d5da81ed53b1da7f9d1aab51b6d6a45c3" exitCode=0 Feb 27 00:20:19 crc kubenswrapper[4781]: I0227 00:20:19.744900 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" event={"ID":"f26f6c49-1028-49bf-9259-4c08b835cfbb","Type":"ContainerDied","Data":"a27f23d6b7b5be18e741a4b8807d607d5da81ed53b1da7f9d1aab51b6d6a45c3"} Feb 27 00:20:21 crc kubenswrapper[4781]: E0227 00:20:21.406145 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf26f6c49_1028_49bf_9259_4c08b835cfbb.slice/crio-81b733e8736de15432ce35c1b163d861ec3f83f233c9edb6595f87fd26ff80c6.scope\": RecentStats: unable to find data in memory cache]" Feb 27 00:20:21 crc kubenswrapper[4781]: I0227 00:20:21.777313 4781 generic.go:334] "Generic (PLEG): container finished" podID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerID="81b733e8736de15432ce35c1b163d861ec3f83f233c9edb6595f87fd26ff80c6" exitCode=0 Feb 27 00:20:21 crc 
kubenswrapper[4781]: I0227 00:20:21.777398 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" event={"ID":"f26f6c49-1028-49bf-9259-4c08b835cfbb","Type":"ContainerDied","Data":"81b733e8736de15432ce35c1b163d861ec3f83f233c9edb6595f87fd26ff80c6"} Feb 27 00:20:22 crc kubenswrapper[4781]: I0227 00:20:22.783441 4781 generic.go:334] "Generic (PLEG): container finished" podID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerID="7238e183748cbed223e390e70c50dd92d5f55a3b96c7fbb9aa940db829fde41c" exitCode=0 Feb 27 00:20:22 crc kubenswrapper[4781]: I0227 00:20:22.783516 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" event={"ID":"f26f6c49-1028-49bf-9259-4c08b835cfbb","Type":"ContainerDied","Data":"7238e183748cbed223e390e70c50dd92d5f55a3b96c7fbb9aa940db829fde41c"} Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.074857 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.143326 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxr5m\" (UniqueName: \"kubernetes.io/projected/f26f6c49-1028-49bf-9259-4c08b835cfbb-kube-api-access-dxr5m\") pod \"f26f6c49-1028-49bf-9259-4c08b835cfbb\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.143434 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-util\") pod \"f26f6c49-1028-49bf-9259-4c08b835cfbb\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.143480 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-bundle\") pod \"f26f6c49-1028-49bf-9259-4c08b835cfbb\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.144239 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-bundle" (OuterVolumeSpecName: "bundle") pod "f26f6c49-1028-49bf-9259-4c08b835cfbb" (UID: "f26f6c49-1028-49bf-9259-4c08b835cfbb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.148550 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f26f6c49-1028-49bf-9259-4c08b835cfbb-kube-api-access-dxr5m" (OuterVolumeSpecName: "kube-api-access-dxr5m") pod "f26f6c49-1028-49bf-9259-4c08b835cfbb" (UID: "f26f6c49-1028-49bf-9259-4c08b835cfbb"). InnerVolumeSpecName "kube-api-access-dxr5m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.162346 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-util" (OuterVolumeSpecName: "util") pod "f26f6c49-1028-49bf-9259-4c08b835cfbb" (UID: "f26f6c49-1028-49bf-9259-4c08b835cfbb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.245657 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxr5m\" (UniqueName: \"kubernetes.io/projected/f26f6c49-1028-49bf-9259-4c08b835cfbb-kube-api-access-dxr5m\") on node \"crc\" DevicePath \"\"" Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.245703 4781 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-util\") on node \"crc\" DevicePath \"\"" Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.245725 4781 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.808279 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" event={"ID":"f26f6c49-1028-49bf-9259-4c08b835cfbb","Type":"ContainerDied","Data":"357800bcda8de6f9e140d7c6d00b143ef325514626de6c9a5edf60efe4d0eda2"} Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.808350 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="357800bcda8de6f9e140d7c6d00b143ef325514626de6c9a5edf60efe4d0eda2" Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.808356 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.737896 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs"] Feb 27 00:20:27 crc kubenswrapper[4781]: E0227 00:20:27.738509 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerName="extract" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.738525 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerName="extract" Feb 27 00:20:27 crc kubenswrapper[4781]: E0227 00:20:27.738547 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerName="pull" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.738554 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerName="pull" Feb 27 00:20:27 crc kubenswrapper[4781]: E0227 00:20:27.738570 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerName="util" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.738579 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerName="util" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.738717 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerName="extract" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.739204 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.741835 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-2vq77" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.741972 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.742415 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.797174 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkwqv\" (UniqueName: \"kubernetes.io/projected/e948619f-a0f4-4463-9076-e593529e4264-kube-api-access-wkwqv\") pod \"nmstate-operator-75c5dccd6c-m8kqs\" (UID: \"e948619f-a0f4-4463-9076-e593529e4264\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.833149 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs"] Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.898591 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkwqv\" (UniqueName: \"kubernetes.io/projected/e948619f-a0f4-4463-9076-e593529e4264-kube-api-access-wkwqv\") pod \"nmstate-operator-75c5dccd6c-m8kqs\" (UID: \"e948619f-a0f4-4463-9076-e593529e4264\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.928067 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkwqv\" (UniqueName: \"kubernetes.io/projected/e948619f-a0f4-4463-9076-e593529e4264-kube-api-access-wkwqv\") pod \"nmstate-operator-75c5dccd6c-m8kqs\" (UID: 
\"e948619f-a0f4-4463-9076-e593529e4264\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs" Feb 27 00:20:28 crc kubenswrapper[4781]: I0227 00:20:28.055649 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs" Feb 27 00:20:28 crc kubenswrapper[4781]: I0227 00:20:28.299823 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs"] Feb 27 00:20:28 crc kubenswrapper[4781]: I0227 00:20:28.831202 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs" event={"ID":"e948619f-a0f4-4463-9076-e593529e4264","Type":"ContainerStarted","Data":"2b36192aa6f618f46a1e4679c7a5e3283e805c87fab274e79464a81048ebf24a"} Feb 27 00:20:31 crc kubenswrapper[4781]: I0227 00:20:31.864601 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs" event={"ID":"e948619f-a0f4-4463-9076-e593529e4264","Type":"ContainerStarted","Data":"39e4b0f10e19f7bb1761553b73ecde98df0a78be2400cf4f599095a526b8a0d7"} Feb 27 00:20:31 crc kubenswrapper[4781]: I0227 00:20:31.892149 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs" podStartSLOduration=2.07249484 podStartE2EDuration="4.892120922s" podCreationTimestamp="2026-02-27 00:20:27 +0000 UTC" firstStartedPulling="2026-02-27 00:20:28.307236449 +0000 UTC m=+897.564776003" lastFinishedPulling="2026-02-27 00:20:31.126862531 +0000 UTC m=+900.384402085" observedRunningTime="2026-02-27 00:20:31.885241407 +0000 UTC m=+901.142780981" watchObservedRunningTime="2026-02-27 00:20:31.892120922 +0000 UTC m=+901.149660506" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.898092 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-4d4ds"] Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 
00:20:32.898906 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.901523 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-bq8xv" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.915655 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p"] Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.916544 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.919010 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.935523 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p"] Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.953210 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-r6fjq"] Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.963317 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmh9w\" (UniqueName: \"kubernetes.io/projected/677ca1f7-513f-4de1-b64b-66b2524b82a1-kube-api-access-cmh9w\") pod \"nmstate-webhook-786f45cff4-vrv7p\" (UID: \"677ca1f7-513f-4de1-b64b-66b2524b82a1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.963413 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbqrm\" (UniqueName: \"kubernetes.io/projected/2b001223-04cf-4a45-843b-e62c5d13ac14-kube-api-access-xbqrm\") pod \"nmstate-metrics-69594cc75-4d4ds\" (UID: 
\"2b001223-04cf-4a45-843b-e62c5d13ac14\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.963458 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/677ca1f7-513f-4de1-b64b-66b2524b82a1-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-vrv7p\" (UID: \"677ca1f7-513f-4de1-b64b-66b2524b82a1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.971672 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.988021 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-4d4ds"] Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.041858 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp"] Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.042559 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.045941 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.045947 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-hfnqh" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.047375 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.052609 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp"] Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.064873 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/677ca1f7-513f-4de1-b64b-66b2524b82a1-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-vrv7p\" (UID: \"677ca1f7-513f-4de1-b64b-66b2524b82a1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.064948 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-nmstate-lock\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.064986 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-dbus-socket\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc 
kubenswrapper[4781]: I0227 00:20:33.065039 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gcdq\" (UniqueName: \"kubernetes.io/projected/f7bf5593-bd4f-462d-bcbf-319b075a5116-kube-api-access-8gcdq\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.065094 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmh9w\" (UniqueName: \"kubernetes.io/projected/677ca1f7-513f-4de1-b64b-66b2524b82a1-kube-api-access-cmh9w\") pod \"nmstate-webhook-786f45cff4-vrv7p\" (UID: \"677ca1f7-513f-4de1-b64b-66b2524b82a1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:33 crc kubenswrapper[4781]: E0227 00:20:33.065098 4781 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.065147 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-ovs-socket\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: E0227 00:20:33.065165 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/677ca1f7-513f-4de1-b64b-66b2524b82a1-tls-key-pair podName:677ca1f7-513f-4de1-b64b-66b2524b82a1 nodeName:}" failed. No retries permitted until 2026-02-27 00:20:33.565146018 +0000 UTC m=+902.822685572 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/677ca1f7-513f-4de1-b64b-66b2524b82a1-tls-key-pair") pod "nmstate-webhook-786f45cff4-vrv7p" (UID: "677ca1f7-513f-4de1-b64b-66b2524b82a1") : secret "openshift-nmstate-webhook" not found Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.065180 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbqrm\" (UniqueName: \"kubernetes.io/projected/2b001223-04cf-4a45-843b-e62c5d13ac14-kube-api-access-xbqrm\") pod \"nmstate-metrics-69594cc75-4d4ds\" (UID: \"2b001223-04cf-4a45-843b-e62c5d13ac14\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.095580 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmh9w\" (UniqueName: \"kubernetes.io/projected/677ca1f7-513f-4de1-b64b-66b2524b82a1-kube-api-access-cmh9w\") pod \"nmstate-webhook-786f45cff4-vrv7p\" (UID: \"677ca1f7-513f-4de1-b64b-66b2524b82a1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.098158 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbqrm\" (UniqueName: \"kubernetes.io/projected/2b001223-04cf-4a45-843b-e62c5d13ac14-kube-api-access-xbqrm\") pod \"nmstate-metrics-69594cc75-4d4ds\" (UID: \"2b001223-04cf-4a45-843b-e62c5d13ac14\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.166615 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-ovs-socket\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.166705 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-nmstate-lock\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.166728 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-dbus-socket\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.166751 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqnhf\" (UniqueName: \"kubernetes.io/projected/fcd8e350-64e3-4a25-9bc5-cce4888da20a-kube-api-access-wqnhf\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.166776 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcd8e350-64e3-4a25-9bc5-cce4888da20a-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.166804 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gcdq\" (UniqueName: \"kubernetes.io/projected/f7bf5593-bd4f-462d-bcbf-319b075a5116-kube-api-access-8gcdq\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: 
I0227 00:20:33.166829 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcd8e350-64e3-4a25-9bc5-cce4888da20a-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.166912 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-ovs-socket\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.166941 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-nmstate-lock\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.167173 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-dbus-socket\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.201278 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gcdq\" (UniqueName: \"kubernetes.io/projected/f7bf5593-bd4f-462d-bcbf-319b075a5116-kube-api-access-8gcdq\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.213867 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.268264 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcd8e350-64e3-4a25-9bc5-cce4888da20a-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.268897 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcd8e350-64e3-4a25-9bc5-cce4888da20a-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.269050 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqnhf\" (UniqueName: \"kubernetes.io/projected/fcd8e350-64e3-4a25-9bc5-cce4888da20a-kube-api-access-wqnhf\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.270170 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcd8e350-64e3-4a25-9bc5-cce4888da20a-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.274217 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fcd8e350-64e3-4a25-9bc5-cce4888da20a-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.278751 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f69fbfd98-lv4mj"] Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.279545 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.296966 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.297409 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqnhf\" (UniqueName: \"kubernetes.io/projected/fcd8e350-64e3-4a25-9bc5-cce4888da20a-kube-api-access-wqnhf\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.297110 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f69fbfd98-lv4mj"] Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.354993 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.373417 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-serving-cert\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.373465 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggl6g\" (UniqueName: \"kubernetes.io/projected/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-kube-api-access-ggl6g\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.373489 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-oauth-config\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.373511 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-config\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.373537 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-service-ca\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.373564 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-oauth-serving-cert\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.373581 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-trusted-ca-bundle\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.474876 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-serving-cert\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.474915 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggl6g\" (UniqueName: \"kubernetes.io/projected/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-kube-api-access-ggl6g\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.474939 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-oauth-config\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.474960 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-config\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.474988 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-service-ca\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.475020 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-oauth-serving-cert\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.475037 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-trusted-ca-bundle\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.476224 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-config\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.476701 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-oauth-serving-cert\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.476912 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-service-ca\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.477114 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-trusted-ca-bundle\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.480617 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-serving-cert\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.480996 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-oauth-config\") 
pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.484856 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-4d4ds"] Feb 27 00:20:33 crc kubenswrapper[4781]: W0227 00:20:33.488231 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b001223_04cf_4a45_843b_e62c5d13ac14.slice/crio-b92dec761f36b95ef2e423133c495dc82e43679e764f384170f4576e434e96a9 WatchSource:0}: Error finding container b92dec761f36b95ef2e423133c495dc82e43679e764f384170f4576e434e96a9: Status 404 returned error can't find the container with id b92dec761f36b95ef2e423133c495dc82e43679e764f384170f4576e434e96a9 Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.492443 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggl6g\" (UniqueName: \"kubernetes.io/projected/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-kube-api-access-ggl6g\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.545508 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp"] Feb 27 00:20:33 crc kubenswrapper[4781]: W0227 00:20:33.556570 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcd8e350_64e3_4a25_9bc5_cce4888da20a.slice/crio-489cc6a2ba3df41687d3dbe738b2af0698b97d99c29e1c4e619a36d0c2ee734b WatchSource:0}: Error finding container 489cc6a2ba3df41687d3dbe738b2af0698b97d99c29e1c4e619a36d0c2ee734b: Status 404 returned error can't find the container with id 489cc6a2ba3df41687d3dbe738b2af0698b97d99c29e1c4e619a36d0c2ee734b Feb 27 00:20:33 crc kubenswrapper[4781]: 
I0227 00:20:33.576406 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/677ca1f7-513f-4de1-b64b-66b2524b82a1-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-vrv7p\" (UID: \"677ca1f7-513f-4de1-b64b-66b2524b82a1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.579433 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/677ca1f7-513f-4de1-b64b-66b2524b82a1-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-vrv7p\" (UID: \"677ca1f7-513f-4de1-b64b-66b2524b82a1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.608453 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.829506 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.877778 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" event={"ID":"2b001223-04cf-4a45-843b-e62c5d13ac14","Type":"ContainerStarted","Data":"b92dec761f36b95ef2e423133c495dc82e43679e764f384170f4576e434e96a9"} Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.878511 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" event={"ID":"fcd8e350-64e3-4a25-9bc5-cce4888da20a","Type":"ContainerStarted","Data":"489cc6a2ba3df41687d3dbe738b2af0698b97d99c29e1c4e619a36d0c2ee734b"} Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.879317 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-r6fjq" event={"ID":"f7bf5593-bd4f-462d-bcbf-319b075a5116","Type":"ContainerStarted","Data":"8ce1d847320d4b7a095d0231ec7bfe36b0b9de7e180d1647a4c7fd21430bb6da"} Feb 27 00:20:34 crc kubenswrapper[4781]: I0227 00:20:34.036975 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p"] Feb 27 00:20:34 crc kubenswrapper[4781]: W0227 00:20:34.040579 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod677ca1f7_513f_4de1_b64b_66b2524b82a1.slice/crio-4efe8519c899784ddd182c7c70d959a95caac65f6d28bb883fe155c5d9cfba5a WatchSource:0}: Error finding container 4efe8519c899784ddd182c7c70d959a95caac65f6d28bb883fe155c5d9cfba5a: Status 404 returned error can't find the container with id 4efe8519c899784ddd182c7c70d959a95caac65f6d28bb883fe155c5d9cfba5a Feb 27 00:20:34 crc kubenswrapper[4781]: W0227 00:20:34.041501 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8078e41d_b0b1_4f24_8d32_cd44b315f3e6.slice/crio-5160ef6ee4b95fd7e750137dc965b4d1c348383a01e2671c904b79e580408d0f WatchSource:0}: Error finding container 5160ef6ee4b95fd7e750137dc965b4d1c348383a01e2671c904b79e580408d0f: Status 404 returned error can't find the container with id 5160ef6ee4b95fd7e750137dc965b4d1c348383a01e2671c904b79e580408d0f Feb 27 00:20:34 crc kubenswrapper[4781]: I0227 00:20:34.043350 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f69fbfd98-lv4mj"] Feb 27 00:20:34 crc kubenswrapper[4781]: I0227 00:20:34.887714 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f69fbfd98-lv4mj" event={"ID":"8078e41d-b0b1-4f24-8d32-cd44b315f3e6","Type":"ContainerStarted","Data":"f4fca42a15c68ede53a37518e14c977fbab2bebb7b36c506d72befc891c19f4e"} Feb 27 00:20:34 crc kubenswrapper[4781]: I0227 00:20:34.888129 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f69fbfd98-lv4mj" event={"ID":"8078e41d-b0b1-4f24-8d32-cd44b315f3e6","Type":"ContainerStarted","Data":"5160ef6ee4b95fd7e750137dc965b4d1c348383a01e2671c904b79e580408d0f"} Feb 27 00:20:34 crc kubenswrapper[4781]: I0227 00:20:34.889198 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" event={"ID":"677ca1f7-513f-4de1-b64b-66b2524b82a1","Type":"ContainerStarted","Data":"4efe8519c899784ddd182c7c70d959a95caac65f6d28bb883fe155c5d9cfba5a"} Feb 27 00:20:34 crc kubenswrapper[4781]: I0227 00:20:34.915653 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f69fbfd98-lv4mj" podStartSLOduration=1.9156116600000002 podStartE2EDuration="1.91561166s" podCreationTimestamp="2026-02-27 00:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 
00:20:34.911926716 +0000 UTC m=+904.169466280" watchObservedRunningTime="2026-02-27 00:20:34.91561166 +0000 UTC m=+904.173151234" Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.906344 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" event={"ID":"677ca1f7-513f-4de1-b64b-66b2524b82a1","Type":"ContainerStarted","Data":"b24d00327247c18c40436086c4b92c8ada38f79188f28be675370d9b57271c6d"} Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.906749 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.908076 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" event={"ID":"2b001223-04cf-4a45-843b-e62c5d13ac14","Type":"ContainerStarted","Data":"be9c6004dce2452f11b70ded74b4f9f553504775e50eaaf43d4aec4d14d75912"} Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.909807 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" event={"ID":"fcd8e350-64e3-4a25-9bc5-cce4888da20a","Type":"ContainerStarted","Data":"540e9e59ee52822899bd9318ccf25217de77c679afe2d9b6ecca061bb64062c4"} Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.911584 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-r6fjq" event={"ID":"f7bf5593-bd4f-462d-bcbf-319b075a5116","Type":"ContainerStarted","Data":"7807686042fa2e86cafbd22dec2c45194cb7970593eea7565633b00292dd3860"} Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.911677 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.921598 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" 
podStartSLOduration=2.43966489 podStartE2EDuration="5.921583263s" podCreationTimestamp="2026-02-27 00:20:32 +0000 UTC" firstStartedPulling="2026-02-27 00:20:34.044054954 +0000 UTC m=+903.301594518" lastFinishedPulling="2026-02-27 00:20:37.525973337 +0000 UTC m=+906.783512891" observedRunningTime="2026-02-27 00:20:37.920601868 +0000 UTC m=+907.178141422" watchObservedRunningTime="2026-02-27 00:20:37.921583263 +0000 UTC m=+907.179122827" Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.937366 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-r6fjq" podStartSLOduration=1.771807468 podStartE2EDuration="5.937351014s" podCreationTimestamp="2026-02-27 00:20:32 +0000 UTC" firstStartedPulling="2026-02-27 00:20:33.354289695 +0000 UTC m=+902.611829249" lastFinishedPulling="2026-02-27 00:20:37.519833251 +0000 UTC m=+906.777372795" observedRunningTime="2026-02-27 00:20:37.936054501 +0000 UTC m=+907.193594075" watchObservedRunningTime="2026-02-27 00:20:37.937351014 +0000 UTC m=+907.194890568" Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.957840 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" podStartSLOduration=1.014247806 podStartE2EDuration="4.957825415s" podCreationTimestamp="2026-02-27 00:20:33 +0000 UTC" firstStartedPulling="2026-02-27 00:20:33.558304445 +0000 UTC m=+902.815843999" lastFinishedPulling="2026-02-27 00:20:37.501882054 +0000 UTC m=+906.759421608" observedRunningTime="2026-02-27 00:20:37.957360093 +0000 UTC m=+907.214899647" watchObservedRunningTime="2026-02-27 00:20:37.957825415 +0000 UTC m=+907.215364969" Feb 27 00:20:40 crc kubenswrapper[4781]: I0227 00:20:40.935726 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" 
event={"ID":"2b001223-04cf-4a45-843b-e62c5d13ac14","Type":"ContainerStarted","Data":"af3d0961a88bd8aa4970ea6c83db478faadeb698c987a49aa383d178c8476d52"} Feb 27 00:20:40 crc kubenswrapper[4781]: I0227 00:20:40.953097 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" podStartSLOduration=2.201980102 podStartE2EDuration="8.953074485s" podCreationTimestamp="2026-02-27 00:20:32 +0000 UTC" firstStartedPulling="2026-02-27 00:20:33.496059351 +0000 UTC m=+902.753598905" lastFinishedPulling="2026-02-27 00:20:40.247153734 +0000 UTC m=+909.504693288" observedRunningTime="2026-02-27 00:20:40.951756132 +0000 UTC m=+910.209295726" watchObservedRunningTime="2026-02-27 00:20:40.953074485 +0000 UTC m=+910.210614079" Feb 27 00:20:42 crc kubenswrapper[4781]: I0227 00:20:42.896026 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:20:42 crc kubenswrapper[4781]: I0227 00:20:42.896398 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:20:42 crc kubenswrapper[4781]: I0227 00:20:42.896460 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:20:42 crc kubenswrapper[4781]: I0227 00:20:42.897302 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"4a4838ae34a31bed19fe04c8cb77eb7ca161a34e4d168445bf5a5f93e91a959a"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:20:42 crc kubenswrapper[4781]: I0227 00:20:42.897396 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://4a4838ae34a31bed19fe04c8cb77eb7ca161a34e4d168445bf5a5f93e91a959a" gracePeriod=600 Feb 27 00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.322572 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.609711 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.610110 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.617347 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.971239 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="4a4838ae34a31bed19fe04c8cb77eb7ca161a34e4d168445bf5a5f93e91a959a" exitCode=0 Feb 27 00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.971319 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"4a4838ae34a31bed19fe04c8cb77eb7ca161a34e4d168445bf5a5f93e91a959a"} Feb 27 
00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.971472 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"58cd249b96a5284dbe453e012e30bb3f9acbc9ed9b891c6e44075d418edc5ad9"} Feb 27 00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.971512 4781 scope.go:117] "RemoveContainer" containerID="98d9908780c17a21a4a701f7c994bde3e3fbb6ea911f1b4e11c3a27ce7db4d1d" Feb 27 00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.984272 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:44 crc kubenswrapper[4781]: I0227 00:20:44.052003 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vtsxv"] Feb 27 00:20:48 crc kubenswrapper[4781]: I0227 00:20:48.967659 4781 scope.go:117] "RemoveContainer" containerID="7ea50ff483bc5e473c8ac4484b625c2d3aca274594f654dad11472e0c517581a" Feb 27 00:20:53 crc kubenswrapper[4781]: I0227 00:20:53.839473 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.414484 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv"] Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.416317 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.418015 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.421020 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv"] Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.504744 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.505094 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.505311 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-788sk\" (UniqueName: \"kubernetes.io/projected/2112f4cb-1229-4856-b3ec-a882e6fba5a6-kube-api-access-788sk\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: 
I0227 00:21:08.606767 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.607037 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.607151 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-788sk\" (UniqueName: \"kubernetes.io/projected/2112f4cb-1229-4856-b3ec-a882e6fba5a6-kube-api-access-788sk\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.607428 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.607438 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.628983 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-788sk\" (UniqueName: \"kubernetes.io/projected/2112f4cb-1229-4856-b3ec-a882e6fba5a6-kube-api-access-788sk\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.730139 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.993451 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv"] Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.094316 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-vtsxv" podUID="76705148-274c-4428-9508-13fe1193646e" containerName="console" containerID="cri-o://ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d" gracePeriod=15 Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.169719 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" event={"ID":"2112f4cb-1229-4856-b3ec-a882e6fba5a6","Type":"ContainerStarted","Data":"0ddb319720490bd564d819eecab4446118317263411f03b09128eef25f9ffd33"} Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.170054 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" event={"ID":"2112f4cb-1229-4856-b3ec-a882e6fba5a6","Type":"ContainerStarted","Data":"a030052635f8b40a9e51e27e5e1179137be1981e6c214ce14624ac6e58bdb42f"} Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.371437 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vtsxv_76705148-274c-4428-9508-13fe1193646e/console/0.log" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.371499 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.416292 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-serving-cert\") pod \"76705148-274c-4428-9508-13fe1193646e\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.416405 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-service-ca\") pod \"76705148-274c-4428-9508-13fe1193646e\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.416436 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-oauth-serving-cert\") pod \"76705148-274c-4428-9508-13fe1193646e\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.416453 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-trusted-ca-bundle\") pod \"76705148-274c-4428-9508-13fe1193646e\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.416475 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-console-config\") pod \"76705148-274c-4428-9508-13fe1193646e\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.416524 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-oauth-config\") pod \"76705148-274c-4428-9508-13fe1193646e\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.416549 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjdbq\" (UniqueName: \"kubernetes.io/projected/76705148-274c-4428-9508-13fe1193646e-kube-api-access-xjdbq\") pod \"76705148-274c-4428-9508-13fe1193646e\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.417245 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "76705148-274c-4428-9508-13fe1193646e" (UID: "76705148-274c-4428-9508-13fe1193646e"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.417259 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-console-config" (OuterVolumeSpecName: "console-config") pod "76705148-274c-4428-9508-13fe1193646e" (UID: "76705148-274c-4428-9508-13fe1193646e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.417280 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "76705148-274c-4428-9508-13fe1193646e" (UID: "76705148-274c-4428-9508-13fe1193646e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.417292 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-service-ca" (OuterVolumeSpecName: "service-ca") pod "76705148-274c-4428-9508-13fe1193646e" (UID: "76705148-274c-4428-9508-13fe1193646e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.421474 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "76705148-274c-4428-9508-13fe1193646e" (UID: "76705148-274c-4428-9508-13fe1193646e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.421723 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "76705148-274c-4428-9508-13fe1193646e" (UID: "76705148-274c-4428-9508-13fe1193646e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.422313 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76705148-274c-4428-9508-13fe1193646e-kube-api-access-xjdbq" (OuterVolumeSpecName: "kube-api-access-xjdbq") pod "76705148-274c-4428-9508-13fe1193646e" (UID: "76705148-274c-4428-9508-13fe1193646e"). InnerVolumeSpecName "kube-api-access-xjdbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.518237 4781 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.518273 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.518282 4781 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.518312 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.518321 4781 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-console-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.518328 4781 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.518336 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjdbq\" (UniqueName: \"kubernetes.io/projected/76705148-274c-4428-9508-13fe1193646e-kube-api-access-xjdbq\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.178204 4781 generic.go:334] "Generic (PLEG): container finished" podID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerID="0ddb319720490bd564d819eecab4446118317263411f03b09128eef25f9ffd33" exitCode=0 Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.178375 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" event={"ID":"2112f4cb-1229-4856-b3ec-a882e6fba5a6","Type":"ContainerDied","Data":"0ddb319720490bd564d819eecab4446118317263411f03b09128eef25f9ffd33"} Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.180255 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.181672 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vtsxv_76705148-274c-4428-9508-13fe1193646e/console/0.log" Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 
00:21:10.181702 4781 generic.go:334] "Generic (PLEG): container finished" podID="76705148-274c-4428-9508-13fe1193646e" containerID="ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d" exitCode=2 Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.181728 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vtsxv" event={"ID":"76705148-274c-4428-9508-13fe1193646e","Type":"ContainerDied","Data":"ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d"} Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.181751 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vtsxv" event={"ID":"76705148-274c-4428-9508-13fe1193646e","Type":"ContainerDied","Data":"52e8848cb853a0dc3b72ab7abe99678676a1a3484d971d2212d9dc7e0814de5c"} Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.181768 4781 scope.go:117] "RemoveContainer" containerID="ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d" Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.181854 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.219128 4781 scope.go:117] "RemoveContainer" containerID="ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d" Feb 27 00:21:10 crc kubenswrapper[4781]: E0227 00:21:10.219938 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d\": container with ID starting with ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d not found: ID does not exist" containerID="ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d" Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.219975 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d"} err="failed to get container status \"ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d\": rpc error: code = NotFound desc = could not find container \"ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d\": container with ID starting with ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d not found: ID does not exist" Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.236241 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vtsxv"] Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.240475 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-vtsxv"] Feb 27 00:21:11 crc kubenswrapper[4781]: I0227 00:21:11.316054 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76705148-274c-4428-9508-13fe1193646e" path="/var/lib/kubelet/pods/76705148-274c-4428-9508-13fe1193646e/volumes" Feb 27 00:21:12 crc kubenswrapper[4781]: E0227 00:21:12.137261 4781 cadvisor_stats_provider.go:516] 
"Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2112f4cb_1229_4856_b3ec_a882e6fba5a6.slice/crio-adb86b91ec03d7d8d5d9703cdb25703d1f3fd14c5f579236b7989c975ab0b1d4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2112f4cb_1229_4856_b3ec_a882e6fba5a6.slice/crio-conmon-adb86b91ec03d7d8d5d9703cdb25703d1f3fd14c5f579236b7989c975ab0b1d4.scope\": RecentStats: unable to find data in memory cache]" Feb 27 00:21:12 crc kubenswrapper[4781]: I0227 00:21:12.201405 4781 generic.go:334] "Generic (PLEG): container finished" podID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerID="adb86b91ec03d7d8d5d9703cdb25703d1f3fd14c5f579236b7989c975ab0b1d4" exitCode=0 Feb 27 00:21:12 crc kubenswrapper[4781]: I0227 00:21:12.201489 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" event={"ID":"2112f4cb-1229-4856-b3ec-a882e6fba5a6","Type":"ContainerDied","Data":"adb86b91ec03d7d8d5d9703cdb25703d1f3fd14c5f579236b7989c975ab0b1d4"} Feb 27 00:21:13 crc kubenswrapper[4781]: I0227 00:21:13.211223 4781 generic.go:334] "Generic (PLEG): container finished" podID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerID="95d543c78d543c8b4990d35fd5e9ec57547440a0e5ce353d2b81c5ee11526c3c" exitCode=0 Feb 27 00:21:13 crc kubenswrapper[4781]: I0227 00:21:13.211339 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" event={"ID":"2112f4cb-1229-4856-b3ec-a882e6fba5a6","Type":"ContainerDied","Data":"95d543c78d543c8b4990d35fd5e9ec57547440a0e5ce353d2b81c5ee11526c3c"} Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.494857 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.585413 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-util\") pod \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.585461 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-bundle\") pod \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.585486 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-788sk\" (UniqueName: \"kubernetes.io/projected/2112f4cb-1229-4856-b3ec-a882e6fba5a6-kube-api-access-788sk\") pod \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.586604 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-bundle" (OuterVolumeSpecName: "bundle") pod "2112f4cb-1229-4856-b3ec-a882e6fba5a6" (UID: "2112f4cb-1229-4856-b3ec-a882e6fba5a6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.590108 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2112f4cb-1229-4856-b3ec-a882e6fba5a6-kube-api-access-788sk" (OuterVolumeSpecName: "kube-api-access-788sk") pod "2112f4cb-1229-4856-b3ec-a882e6fba5a6" (UID: "2112f4cb-1229-4856-b3ec-a882e6fba5a6"). InnerVolumeSpecName "kube-api-access-788sk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.686957 4781 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.686986 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-788sk\" (UniqueName: \"kubernetes.io/projected/2112f4cb-1229-4856-b3ec-a882e6fba5a6-kube-api-access-788sk\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.795852 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-util" (OuterVolumeSpecName: "util") pod "2112f4cb-1229-4856-b3ec-a882e6fba5a6" (UID: "2112f4cb-1229-4856-b3ec-a882e6fba5a6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.890622 4781 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-util\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:15 crc kubenswrapper[4781]: I0227 00:21:15.228172 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" event={"ID":"2112f4cb-1229-4856-b3ec-a882e6fba5a6","Type":"ContainerDied","Data":"a030052635f8b40a9e51e27e5e1179137be1981e6c214ce14624ac6e58bdb42f"} Feb 27 00:21:15 crc kubenswrapper[4781]: I0227 00:21:15.228231 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a030052635f8b40a9e51e27e5e1179137be1981e6c214ce14624ac6e58bdb42f" Feb 27 00:21:15 crc kubenswrapper[4781]: I0227 00:21:15.228288 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.910071 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk"] Feb 27 00:21:22 crc kubenswrapper[4781]: E0227 00:21:22.910704 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerName="extract" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.910715 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerName="extract" Feb 27 00:21:22 crc kubenswrapper[4781]: E0227 00:21:22.910732 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerName="pull" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.910738 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerName="pull" Feb 27 00:21:22 crc kubenswrapper[4781]: E0227 00:21:22.910750 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76705148-274c-4428-9508-13fe1193646e" containerName="console" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.910758 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="76705148-274c-4428-9508-13fe1193646e" containerName="console" Feb 27 00:21:22 crc kubenswrapper[4781]: E0227 00:21:22.910773 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerName="util" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.910779 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerName="util" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.910873 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="76705148-274c-4428-9508-13fe1193646e" 
containerName="console" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.910886 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerName="extract" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.911242 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.913025 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.913488 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.914004 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.914322 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.915056 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-ggqth" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.928990 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk"] Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.014311 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7020f39f-9738-4625-bd18-e5e4e64f5956-apiservice-cert\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 
27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.014368 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7020f39f-9738-4625-bd18-e5e4e64f5956-webhook-cert\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.014429 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjzsx\" (UniqueName: \"kubernetes.io/projected/7020f39f-9738-4625-bd18-e5e4e64f5956-kube-api-access-fjzsx\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.115221 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7020f39f-9738-4625-bd18-e5e4e64f5956-apiservice-cert\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.115277 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7020f39f-9738-4625-bd18-e5e4e64f5956-webhook-cert\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.115322 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjzsx\" (UniqueName: 
\"kubernetes.io/projected/7020f39f-9738-4625-bd18-e5e4e64f5956-kube-api-access-fjzsx\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.137484 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7020f39f-9738-4625-bd18-e5e4e64f5956-apiservice-cert\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.137486 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7020f39f-9738-4625-bd18-e5e4e64f5956-webhook-cert\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.139916 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjzsx\" (UniqueName: \"kubernetes.io/projected/7020f39f-9738-4625-bd18-e5e4e64f5956-kube-api-access-fjzsx\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.145127 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v"] Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.146007 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.150650 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-wfgd6" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.150881 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.151422 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.157265 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v"] Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.216159 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc2d6f99-bd3f-44e8-91fc-6865285089e7-webhook-cert\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.216231 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fc2d6f99-bd3f-44e8-91fc-6865285089e7-apiservice-cert\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.216453 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9sxk\" (UniqueName: 
\"kubernetes.io/projected/fc2d6f99-bd3f-44e8-91fc-6865285089e7-kube-api-access-r9sxk\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.227023 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.317256 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9sxk\" (UniqueName: \"kubernetes.io/projected/fc2d6f99-bd3f-44e8-91fc-6865285089e7-kube-api-access-r9sxk\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.317495 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc2d6f99-bd3f-44e8-91fc-6865285089e7-webhook-cert\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.317532 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fc2d6f99-bd3f-44e8-91fc-6865285089e7-apiservice-cert\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.324817 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/fc2d6f99-bd3f-44e8-91fc-6865285089e7-apiservice-cert\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.334878 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc2d6f99-bd3f-44e8-91fc-6865285089e7-webhook-cert\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.335277 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9sxk\" (UniqueName: \"kubernetes.io/projected/fc2d6f99-bd3f-44e8-91fc-6865285089e7-kube-api-access-r9sxk\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.472918 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v"
Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.668890 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk"]
Feb 27 00:21:23 crc kubenswrapper[4781]: W0227 00:21:23.683172 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7020f39f_9738_4625_bd18_e5e4e64f5956.slice/crio-774de6d7742a7d040ae61feaa4484837b3d9ed78423d8cd297d214bf8bdf3172 WatchSource:0}: Error finding container 774de6d7742a7d040ae61feaa4484837b3d9ed78423d8cd297d214bf8bdf3172: Status 404 returned error can't find the container with id 774de6d7742a7d040ae61feaa4484837b3d9ed78423d8cd297d214bf8bdf3172
Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.780058 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v"]
Feb 27 00:21:23 crc kubenswrapper[4781]: W0227 00:21:23.781367 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc2d6f99_bd3f_44e8_91fc_6865285089e7.slice/crio-09876a7185cb41db89ce204b51999b41bbfe3bf4efacd45611c321f31d8985f7 WatchSource:0}: Error finding container 09876a7185cb41db89ce204b51999b41bbfe3bf4efacd45611c321f31d8985f7: Status 404 returned error can't find the container with id 09876a7185cb41db89ce204b51999b41bbfe3bf4efacd45611c321f31d8985f7
Feb 27 00:21:24 crc kubenswrapper[4781]: I0227 00:21:24.662610 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" event={"ID":"fc2d6f99-bd3f-44e8-91fc-6865285089e7","Type":"ContainerStarted","Data":"09876a7185cb41db89ce204b51999b41bbfe3bf4efacd45611c321f31d8985f7"}
Feb 27 00:21:24 crc kubenswrapper[4781]: I0227 00:21:24.664171 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" event={"ID":"7020f39f-9738-4625-bd18-e5e4e64f5956","Type":"ContainerStarted","Data":"774de6d7742a7d040ae61feaa4484837b3d9ed78423d8cd297d214bf8bdf3172"}
Feb 27 00:21:29 crc kubenswrapper[4781]: I0227 00:21:29.697917 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" event={"ID":"fc2d6f99-bd3f-44e8-91fc-6865285089e7","Type":"ContainerStarted","Data":"e4bc153d14b08884f0e2c27fe60ed230bdb4b4f20cac1e6f22273c11caf74a2d"}
Feb 27 00:21:29 crc kubenswrapper[4781]: I0227 00:21:29.698705 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v"
Feb 27 00:21:29 crc kubenswrapper[4781]: I0227 00:21:29.705066 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" event={"ID":"7020f39f-9738-4625-bd18-e5e4e64f5956","Type":"ContainerStarted","Data":"ad28ac70dc1334490f5a6920c6bb8dc290a7c8df00df805e4638c4994f1eb331"}
Feb 27 00:21:29 crc kubenswrapper[4781]: I0227 00:21:29.705477 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk"
Feb 27 00:21:29 crc kubenswrapper[4781]: I0227 00:21:29.746547 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" podStartSLOduration=1.183455584 podStartE2EDuration="6.74652652s" podCreationTimestamp="2026-02-27 00:21:23 +0000 UTC" firstStartedPulling="2026-02-27 00:21:23.784705098 +0000 UTC m=+953.042244652" lastFinishedPulling="2026-02-27 00:21:29.347776034 +0000 UTC m=+958.605315588" observedRunningTime="2026-02-27 00:21:29.739115951 +0000 UTC m=+958.996655525" watchObservedRunningTime="2026-02-27 00:21:29.74652652 +0000 UTC m=+959.004066074"
Feb 27 00:21:29 crc kubenswrapper[4781]: I0227 00:21:29.759712 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" podStartSLOduration=2.115557466 podStartE2EDuration="7.759691164s" podCreationTimestamp="2026-02-27 00:21:22 +0000 UTC" firstStartedPulling="2026-02-27 00:21:23.689299941 +0000 UTC m=+952.946839495" lastFinishedPulling="2026-02-27 00:21:29.333433639 +0000 UTC m=+958.590973193" observedRunningTime="2026-02-27 00:21:29.758227867 +0000 UTC m=+959.015767501" watchObservedRunningTime="2026-02-27 00:21:29.759691164 +0000 UTC m=+959.017230718"
Feb 27 00:21:43 crc kubenswrapper[4781]: I0227 00:21:43.477899 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v"
Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.129646 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535862-l9vc5"]
Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.130951 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535862-l9vc5"
Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.133099 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.133197 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr"
Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.133278 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.140603 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535862-l9vc5"]
Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.225812 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmtbc\" (UniqueName: \"kubernetes.io/projected/411dc0f9-584c-453b-a137-189ab8731570-kube-api-access-jmtbc\") pod \"auto-csr-approver-29535862-l9vc5\" (UID: \"411dc0f9-584c-453b-a137-189ab8731570\") " pod="openshift-infra/auto-csr-approver-29535862-l9vc5"
Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.326573 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmtbc\" (UniqueName: \"kubernetes.io/projected/411dc0f9-584c-453b-a137-189ab8731570-kube-api-access-jmtbc\") pod \"auto-csr-approver-29535862-l9vc5\" (UID: \"411dc0f9-584c-453b-a137-189ab8731570\") " pod="openshift-infra/auto-csr-approver-29535862-l9vc5"
Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.345423 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmtbc\" (UniqueName: \"kubernetes.io/projected/411dc0f9-584c-453b-a137-189ab8731570-kube-api-access-jmtbc\") pod \"auto-csr-approver-29535862-l9vc5\" (UID: \"411dc0f9-584c-453b-a137-189ab8731570\") " pod="openshift-infra/auto-csr-approver-29535862-l9vc5"
Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.450169 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535862-l9vc5"
Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.866900 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535862-l9vc5"]
Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.931577 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535862-l9vc5" event={"ID":"411dc0f9-584c-453b-a137-189ab8731570","Type":"ContainerStarted","Data":"a3bf3c618ee2624440aeefa3c46dec475d48ce699c73b107c9ab38efd57223c1"}
Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.303333 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tcfvp"]
Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.305737 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tcfvp"
Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.329115 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tcfvp"]
Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.458002 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvznh\" (UniqueName: \"kubernetes.io/projected/d292517f-9d33-4590-beae-e0810b1395fa-kube-api-access-vvznh\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp"
Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.458050 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-catalog-content\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp"
Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.458140 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-utilities\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp"
Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.559755 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvznh\" (UniqueName: \"kubernetes.io/projected/d292517f-9d33-4590-beae-e0810b1395fa-kube-api-access-vvznh\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp"
Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.560382 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-catalog-content\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp"
Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.560888 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-catalog-content\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp"
Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.561100 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-utilities\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp"
Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.561535 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-utilities\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp"
Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.578584 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvznh\" (UniqueName: \"kubernetes.io/projected/d292517f-9d33-4590-beae-e0810b1395fa-kube-api-access-vvznh\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp"
Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.624431 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tcfvp"
Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.944281 4781 generic.go:334] "Generic (PLEG): container finished" podID="411dc0f9-584c-453b-a137-189ab8731570" containerID="576df563fec491fe4b88b02b86a929d4019c459ebde0d69bbe30c74025de222c" exitCode=0
Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.944438 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535862-l9vc5" event={"ID":"411dc0f9-584c-453b-a137-189ab8731570","Type":"ContainerDied","Data":"576df563fec491fe4b88b02b86a929d4019c459ebde0d69bbe30c74025de222c"}
Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.137593 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tcfvp"]
Feb 27 00:22:03 crc kubenswrapper[4781]: W0227 00:22:03.141967 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd292517f_9d33_4590_beae_e0810b1395fa.slice/crio-dc30d05e6e59ba685bb93b7eed25462156f94fb18da11a6113fc7887bdfc1c16 WatchSource:0}: Error finding container dc30d05e6e59ba685bb93b7eed25462156f94fb18da11a6113fc7887bdfc1c16: Status 404 returned error can't find the container with id dc30d05e6e59ba685bb93b7eed25462156f94fb18da11a6113fc7887bdfc1c16
Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.229418 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk"
Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.955115 4781 generic.go:334] "Generic (PLEG): container finished" podID="d292517f-9d33-4590-beae-e0810b1395fa" containerID="3f48d77562b78b7c4eb9406b5bae70799989c1e6ce32c4c8dfce12eef51c7679" exitCode=0
Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.955225 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcfvp" event={"ID":"d292517f-9d33-4590-beae-e0810b1395fa","Type":"ContainerDied","Data":"3f48d77562b78b7c4eb9406b5bae70799989c1e6ce32c4c8dfce12eef51c7679"}
Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.955481 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcfvp" event={"ID":"d292517f-9d33-4590-beae-e0810b1395fa","Type":"ContainerStarted","Data":"dc30d05e6e59ba685bb93b7eed25462156f94fb18da11a6113fc7887bdfc1c16"}
Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.968207 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-j2n85"]
Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.971406 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.977384 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.977590 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.977609 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-kk2xt"
Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.989355 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx"]
Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.990279 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx"
Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.997469 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx"]
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.000236 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.063657 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-tljmv"]
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.064579 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tljmv"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.072698 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.072808 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.073466 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-jlhm7"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.073586 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087193 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6cpn\" (UniqueName: \"kubernetes.io/projected/43006307-3a88-4e83-b57f-965df4bd043d-kube-api-access-c6cpn\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087248 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/43006307-3a88-4e83-b57f-965df4bd043d-frr-startup\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087277 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31409f77-5542-4376-8d77-c7a018b245b7-cert\") pod \"frr-k8s-webhook-server-7f989f654f-cqkgx\" (UID: \"31409f77-5542-4376-8d77-c7a018b245b7\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087330 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-metrics\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087362 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43006307-3a88-4e83-b57f-965df4bd043d-metrics-certs\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087393 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4k97\" (UniqueName: \"kubernetes.io/projected/31409f77-5542-4376-8d77-c7a018b245b7-kube-api-access-r4k97\") pod \"frr-k8s-webhook-server-7f989f654f-cqkgx\" (UID: \"31409f77-5542-4376-8d77-c7a018b245b7\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087422 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-frr-conf\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087452 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-frr-sockets\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087478 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-reloader\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.090679 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-c6m2v"]
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.092005 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-c6m2v"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.096806 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.111525 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-c6m2v"]
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.197409 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc6f679c-913d-4851-b69d-a2e26ebf450a-metrics-certs\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198340 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-metrics\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198374 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43006307-3a88-4e83-b57f-965df4bd043d-metrics-certs\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198435 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4k97\" (UniqueName: \"kubernetes.io/projected/31409f77-5542-4376-8d77-c7a018b245b7-kube-api-access-r4k97\") pod \"frr-k8s-webhook-server-7f989f654f-cqkgx\" (UID: \"31409f77-5542-4376-8d77-c7a018b245b7\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198463 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198512 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-frr-conf\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198537 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6f679c-913d-4851-b69d-a2e26ebf450a-cert\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198558 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-frr-sockets\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198602 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-reloader\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198662 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-metrics-certs\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198704 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf5fg\" (UniqueName: \"kubernetes.io/projected/dc6f679c-913d-4851-b69d-a2e26ebf450a-kube-api-access-qf5fg\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198750 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-metrics\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198775 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vx4c\" (UniqueName: \"kubernetes.io/projected/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-kube-api-access-5vx4c\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198878 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6cpn\" (UniqueName: \"kubernetes.io/projected/43006307-3a88-4e83-b57f-965df4bd043d-kube-api-access-c6cpn\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198903 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-metallb-excludel2\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198926 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/43006307-3a88-4e83-b57f-965df4bd043d-frr-startup\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198953 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31409f77-5542-4376-8d77-c7a018b245b7-cert\") pod \"frr-k8s-webhook-server-7f989f654f-cqkgx\" (UID: \"31409f77-5542-4376-8d77-c7a018b245b7\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.200586 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-frr-conf\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.201497 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/43006307-3a88-4e83-b57f-965df4bd043d-frr-startup\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.201820 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-reloader\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.202325 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-frr-sockets\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.207134 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43006307-3a88-4e83-b57f-965df4bd043d-metrics-certs\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.218351 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6cpn\" (UniqueName: \"kubernetes.io/projected/43006307-3a88-4e83-b57f-965df4bd043d-kube-api-access-c6cpn\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.220478 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4k97\" (UniqueName: \"kubernetes.io/projected/31409f77-5542-4376-8d77-c7a018b245b7-kube-api-access-r4k97\") pod \"frr-k8s-webhook-server-7f989f654f-cqkgx\" (UID: \"31409f77-5542-4376-8d77-c7a018b245b7\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.226850 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31409f77-5542-4376-8d77-c7a018b245b7-cert\") pod \"frr-k8s-webhook-server-7f989f654f-cqkgx\" (UID: \"31409f77-5542-4376-8d77-c7a018b245b7\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.287329 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535862-l9vc5"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.293977 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-j2n85"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.300501 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-metallb-excludel2\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.300568 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc6f679c-913d-4851-b69d-a2e26ebf450a-metrics-certs\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.300607 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.300691 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6f679c-913d-4851-b69d-a2e26ebf450a-cert\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.300726 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-metrics-certs\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.300767 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf5fg\" (UniqueName: \"kubernetes.io/projected/dc6f679c-913d-4851-b69d-a2e26ebf450a-kube-api-access-qf5fg\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.300799 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vx4c\" (UniqueName: \"kubernetes.io/projected/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-kube-api-access-5vx4c\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv"
Feb 27 00:22:04 crc kubenswrapper[4781]: E0227 00:22:04.300797 4781 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Feb 27 00:22:04 crc kubenswrapper[4781]: E0227 00:22:04.300892 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist podName:5d7e20ea-c069-4c29-9c3f-1ac3404f026c nodeName:}" failed. No retries permitted until 2026-02-27 00:22:04.800871751 +0000 UTC m=+994.058411305 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist") pod "speaker-tljmv" (UID: "5d7e20ea-c069-4c29-9c3f-1ac3404f026c") : secret "metallb-memberlist" not found
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.301568 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-metallb-excludel2\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.303669 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.304878 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-metrics-certs\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.305399 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc6f679c-913d-4851-b69d-a2e26ebf450a-metrics-certs\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.312580 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.318945 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6f679c-913d-4851-b69d-a2e26ebf450a-cert\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.320668 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vx4c\" (UniqueName: \"kubernetes.io/projected/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-kube-api-access-5vx4c\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.323387 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf5fg\" (UniqueName: \"kubernetes.io/projected/dc6f679c-913d-4851-b69d-a2e26ebf450a-kube-api-access-qf5fg\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.402060 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmtbc\" (UniqueName: \"kubernetes.io/projected/411dc0f9-584c-453b-a137-189ab8731570-kube-api-access-jmtbc\") pod \"411dc0f9-584c-453b-a137-189ab8731570\" (UID: \"411dc0f9-584c-453b-a137-189ab8731570\") "
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.406081 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/411dc0f9-584c-453b-a137-189ab8731570-kube-api-access-jmtbc" (OuterVolumeSpecName: "kube-api-access-jmtbc") pod "411dc0f9-584c-453b-a137-189ab8731570" (UID: "411dc0f9-584c-453b-a137-189ab8731570"). InnerVolumeSpecName "kube-api-access-jmtbc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.413006 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-c6m2v"
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.503656 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmtbc\" (UniqueName: \"kubernetes.io/projected/411dc0f9-584c-453b-a137-189ab8731570-kube-api-access-jmtbc\") on node \"crc\" DevicePath \"\""
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.645942 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-c6m2v"]
Feb 27 00:22:04 crc kubenswrapper[4781]: W0227 00:22:04.657049 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc6f679c_913d_4851_b69d_a2e26ebf450a.slice/crio-a16333594994dc88f16db6f7a76d6fff70645922ee4b477ccd2ef2ef339da335 WatchSource:0}: Error finding container a16333594994dc88f16db6f7a76d6fff70645922ee4b477ccd2ef2ef339da335: Status 404 returned error can't find the container with id a16333594994dc88f16db6f7a76d6fff70645922ee4b477ccd2ef2ef339da335
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.754574 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx"]
Feb 27 00:22:04 crc kubenswrapper[4781]: W0227 00:22:04.759880 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31409f77_5542_4376_8d77_c7a018b245b7.slice/crio-162d067f06ee8ce69c09b9a544e249ca9c866a8714079288e2bbeaaa36bf057d WatchSource:0}: Error finding container 162d067f06ee8ce69c09b9a544e249ca9c866a8714079288e2bbeaaa36bf057d: Status 404 returned error can't find the container with id 162d067f06ee8ce69c09b9a544e249ca9c866a8714079288e2bbeaaa36bf057d
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.808211 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv"
Feb 27 00:22:04 crc kubenswrapper[4781]: E0227 00:22:04.808408 4781 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Feb 27 00:22:04 crc kubenswrapper[4781]: E0227 00:22:04.808826 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist podName:5d7e20ea-c069-4c29-9c3f-1ac3404f026c nodeName:}" failed. No retries permitted until 2026-02-27 00:22:05.808799695 +0000 UTC m=+995.066339249 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist") pod "speaker-tljmv" (UID: "5d7e20ea-c069-4c29-9c3f-1ac3404f026c") : secret "metallb-memberlist" not found
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.963260 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcfvp" event={"ID":"d292517f-9d33-4590-beae-e0810b1395fa","Type":"ContainerStarted","Data":"fbb42f4e19a7208fb759ce636689c17c2b7bc5e5f12afafb5c11f053e74607ac"}
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.964707 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" event={"ID":"31409f77-5542-4376-8d77-c7a018b245b7","Type":"ContainerStarted","Data":"162d067f06ee8ce69c09b9a544e249ca9c866a8714079288e2bbeaaa36bf057d"}
Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.966247 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535862-l9vc5"
event={"ID":"411dc0f9-584c-453b-a137-189ab8731570","Type":"ContainerDied","Data":"a3bf3c618ee2624440aeefa3c46dec475d48ce699c73b107c9ab38efd57223c1"} Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.966291 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3bf3c618ee2624440aeefa3c46dec475d48ce699c73b107c9ab38efd57223c1" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.966312 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535862-l9vc5" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.972587 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerStarted","Data":"f908361cb936292eea5614d03642ac94b248356b5031911f9f7dce351a864876"} Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.973743 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-c6m2v" event={"ID":"dc6f679c-913d-4851-b69d-a2e26ebf450a","Type":"ContainerStarted","Data":"0ae361d3d561edda2bf7a30eef558fd7d3b7a1392f13785b2fa632ebb62dc787"} Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.973771 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-c6m2v" event={"ID":"dc6f679c-913d-4851-b69d-a2e26ebf450a","Type":"ContainerStarted","Data":"fae3bf950a751276dea979e227c7699208eb922107b0f38fc5f75c0f692d9879"} Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.973781 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-c6m2v" event={"ID":"dc6f679c-913d-4851-b69d-a2e26ebf450a","Type":"ContainerStarted","Data":"a16333594994dc88f16db6f7a76d6fff70645922ee4b477ccd2ef2ef339da335"} Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.974381 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/controller-86ddb6bd46-c6m2v" Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.007432 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-c6m2v" podStartSLOduration=1.007415136 podStartE2EDuration="1.007415136s" podCreationTimestamp="2026-02-27 00:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:22:05.005686181 +0000 UTC m=+994.263225745" watchObservedRunningTime="2026-02-27 00:22:05.007415136 +0000 UTC m=+994.264954690" Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.335575 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535856-mznwl"] Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.339437 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535856-mznwl"] Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.819980 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv" Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.825859 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv" Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.898597 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-tljmv" Feb 27 00:22:05 crc kubenswrapper[4781]: W0227 00:22:05.934215 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d7e20ea_c069_4c29_9c3f_1ac3404f026c.slice/crio-0fdbf8c09378b719500e9ccdad3108a428c9aad7a3d7ef576fe7190bcf78487d WatchSource:0}: Error finding container 0fdbf8c09378b719500e9ccdad3108a428c9aad7a3d7ef576fe7190bcf78487d: Status 404 returned error can't find the container with id 0fdbf8c09378b719500e9ccdad3108a428c9aad7a3d7ef576fe7190bcf78487d Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.983255 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tljmv" event={"ID":"5d7e20ea-c069-4c29-9c3f-1ac3404f026c","Type":"ContainerStarted","Data":"0fdbf8c09378b719500e9ccdad3108a428c9aad7a3d7ef576fe7190bcf78487d"} Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.986949 4781 generic.go:334] "Generic (PLEG): container finished" podID="d292517f-9d33-4590-beae-e0810b1395fa" containerID="fbb42f4e19a7208fb759ce636689c17c2b7bc5e5f12afafb5c11f053e74607ac" exitCode=0 Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.987002 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcfvp" event={"ID":"d292517f-9d33-4590-beae-e0810b1395fa","Type":"ContainerDied","Data":"fbb42f4e19a7208fb759ce636689c17c2b7bc5e5f12afafb5c11f053e74607ac"} Feb 27 00:22:07 crc kubenswrapper[4781]: I0227 00:22:06.998877 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tljmv" event={"ID":"5d7e20ea-c069-4c29-9c3f-1ac3404f026c","Type":"ContainerStarted","Data":"1adaf3191383877a7fb7daba3e2b41a4a95c8c9c9e58d88bdc661ccf7ec02d62"} Feb 27 00:22:07 crc kubenswrapper[4781]: I0227 00:22:06.999175 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tljmv" 
event={"ID":"5d7e20ea-c069-4c29-9c3f-1ac3404f026c","Type":"ContainerStarted","Data":"cc53f5a28d5cd6ff49f502e5ea0f9d76e4d5127e689b200232c8930f8cc03f26"} Feb 27 00:22:07 crc kubenswrapper[4781]: I0227 00:22:07.000002 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-tljmv" Feb 27 00:22:07 crc kubenswrapper[4781]: I0227 00:22:07.009189 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcfvp" event={"ID":"d292517f-9d33-4590-beae-e0810b1395fa","Type":"ContainerStarted","Data":"d2674464663697b7cd845ec2becb3f87c0311300123fbaaf0825ff61fcdf9b0a"} Feb 27 00:22:07 crc kubenswrapper[4781]: I0227 00:22:07.020385 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-tljmv" podStartSLOduration=3.020367204 podStartE2EDuration="3.020367204s" podCreationTimestamp="2026-02-27 00:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:22:07.017456388 +0000 UTC m=+996.274995942" watchObservedRunningTime="2026-02-27 00:22:07.020367204 +0000 UTC m=+996.277906758" Feb 27 00:22:07 crc kubenswrapper[4781]: I0227 00:22:07.040672 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tcfvp" podStartSLOduration=2.6171347750000002 podStartE2EDuration="5.040655654s" podCreationTimestamp="2026-02-27 00:22:02 +0000 UTC" firstStartedPulling="2026-02-27 00:22:03.957775444 +0000 UTC m=+993.215315018" lastFinishedPulling="2026-02-27 00:22:06.381296343 +0000 UTC m=+995.638835897" observedRunningTime="2026-02-27 00:22:07.039872034 +0000 UTC m=+996.297411578" watchObservedRunningTime="2026-02-27 00:22:07.040655654 +0000 UTC m=+996.298195208" Feb 27 00:22:07 crc kubenswrapper[4781]: I0227 00:22:07.319398 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="778d83b2-2e0c-45b3-a296-aaba355c6427" path="/var/lib/kubelet/pods/778d83b2-2e0c-45b3-a296-aaba355c6427/volumes" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.089474 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k995x"] Feb 27 00:22:09 crc kubenswrapper[4781]: E0227 00:22:09.090057 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="411dc0f9-584c-453b-a137-189ab8731570" containerName="oc" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.090074 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="411dc0f9-584c-453b-a137-189ab8731570" containerName="oc" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.090214 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="411dc0f9-584c-453b-a137-189ab8731570" containerName="oc" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.091212 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.106287 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k995x"] Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.265129 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jvpz\" (UniqueName: \"kubernetes.io/projected/586212ca-1380-4fea-a2f1-105fc30f56e3-kube-api-access-4jvpz\") pod \"certified-operators-k995x\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.265235 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-catalog-content\") pod \"certified-operators-k995x\" (UID: 
\"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.265303 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-utilities\") pod \"certified-operators-k995x\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.366901 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-utilities\") pod \"certified-operators-k995x\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.366958 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jvpz\" (UniqueName: \"kubernetes.io/projected/586212ca-1380-4fea-a2f1-105fc30f56e3-kube-api-access-4jvpz\") pod \"certified-operators-k995x\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.367010 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-catalog-content\") pod \"certified-operators-k995x\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.367521 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-catalog-content\") pod \"certified-operators-k995x\" (UID: 
\"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.367524 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-utilities\") pod \"certified-operators-k995x\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.394391 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jvpz\" (UniqueName: \"kubernetes.io/projected/586212ca-1380-4fea-a2f1-105fc30f56e3-kube-api-access-4jvpz\") pod \"certified-operators-k995x\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.418510 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.714090 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k995x"] Feb 27 00:22:10 crc kubenswrapper[4781]: I0227 00:22:10.044061 4781 generic.go:334] "Generic (PLEG): container finished" podID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerID="54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed" exitCode=0 Feb 27 00:22:10 crc kubenswrapper[4781]: I0227 00:22:10.044250 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k995x" event={"ID":"586212ca-1380-4fea-a2f1-105fc30f56e3","Type":"ContainerDied","Data":"54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed"} Feb 27 00:22:10 crc kubenswrapper[4781]: I0227 00:22:10.044390 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k995x" 
event={"ID":"586212ca-1380-4fea-a2f1-105fc30f56e3","Type":"ContainerStarted","Data":"08252891894ad181fdad10de247b6b64df19ad9963bf55e86249173eed72e1a9"} Feb 27 00:22:11 crc kubenswrapper[4781]: I0227 00:22:11.059175 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k995x" event={"ID":"586212ca-1380-4fea-a2f1-105fc30f56e3","Type":"ContainerStarted","Data":"bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6"} Feb 27 00:22:12 crc kubenswrapper[4781]: I0227 00:22:12.066443 4781 generic.go:334] "Generic (PLEG): container finished" podID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerID="bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6" exitCode=0 Feb 27 00:22:12 crc kubenswrapper[4781]: I0227 00:22:12.067212 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k995x" event={"ID":"586212ca-1380-4fea-a2f1-105fc30f56e3","Type":"ContainerDied","Data":"bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6"} Feb 27 00:22:12 crc kubenswrapper[4781]: I0227 00:22:12.624763 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:12 crc kubenswrapper[4781]: I0227 00:22:12.625064 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:12 crc kubenswrapper[4781]: I0227 00:22:12.668965 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:13 crc kubenswrapper[4781]: I0227 00:22:13.126608 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:14 crc kubenswrapper[4781]: I0227 00:22:14.419771 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-c6m2v" 
Feb 27 00:22:15 crc kubenswrapper[4781]: I0227 00:22:15.092979 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k995x" event={"ID":"586212ca-1380-4fea-a2f1-105fc30f56e3","Type":"ContainerStarted","Data":"3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98"} Feb 27 00:22:15 crc kubenswrapper[4781]: I0227 00:22:15.094902 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" event={"ID":"31409f77-5542-4376-8d77-c7a018b245b7","Type":"ContainerStarted","Data":"9479c6a31a508ff24ffdf215a0552d8503f7f28a59e569d367d9eca78442dc30"} Feb 27 00:22:15 crc kubenswrapper[4781]: I0227 00:22:15.095007 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" Feb 27 00:22:15 crc kubenswrapper[4781]: I0227 00:22:15.098171 4781 generic.go:334] "Generic (PLEG): container finished" podID="43006307-3a88-4e83-b57f-965df4bd043d" containerID="7d1a97274b38bf99a1ed86190d4f1d99c2588ba1ecd252177f8d8bb041cc8621" exitCode=0 Feb 27 00:22:15 crc kubenswrapper[4781]: I0227 00:22:15.098309 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerDied","Data":"7d1a97274b38bf99a1ed86190d4f1d99c2588ba1ecd252177f8d8bb041cc8621"} Feb 27 00:22:15 crc kubenswrapper[4781]: I0227 00:22:15.117149 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k995x" podStartSLOduration=1.83005375 podStartE2EDuration="6.117122111s" podCreationTimestamp="2026-02-27 00:22:09 +0000 UTC" firstStartedPulling="2026-02-27 00:22:10.047005615 +0000 UTC m=+999.304545169" lastFinishedPulling="2026-02-27 00:22:14.334073976 +0000 UTC m=+1003.591613530" observedRunningTime="2026-02-27 00:22:15.111012131 +0000 UTC m=+1004.368551725" watchObservedRunningTime="2026-02-27 
00:22:15.117122111 +0000 UTC m=+1004.374661695" Feb 27 00:22:15 crc kubenswrapper[4781]: I0227 00:22:15.131316 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" podStartSLOduration=2.809616932 podStartE2EDuration="12.131296612s" podCreationTimestamp="2026-02-27 00:22:03 +0000 UTC" firstStartedPulling="2026-02-27 00:22:04.762024593 +0000 UTC m=+994.019564147" lastFinishedPulling="2026-02-27 00:22:14.083704273 +0000 UTC m=+1003.341243827" observedRunningTime="2026-02-27 00:22:15.127115432 +0000 UTC m=+1004.384654996" watchObservedRunningTime="2026-02-27 00:22:15.131296612 +0000 UTC m=+1004.388836186" Feb 27 00:22:16 crc kubenswrapper[4781]: I0227 00:22:16.107763 4781 generic.go:334] "Generic (PLEG): container finished" podID="43006307-3a88-4e83-b57f-965df4bd043d" containerID="6b7b7dab1857c3218b173c7757535f1d9dc87f018d0d0499b560386781f0d9cd" exitCode=0 Feb 27 00:22:16 crc kubenswrapper[4781]: I0227 00:22:16.107821 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerDied","Data":"6b7b7dab1857c3218b173c7757535f1d9dc87f018d0d0499b560386781f0d9cd"} Feb 27 00:22:16 crc kubenswrapper[4781]: I0227 00:22:16.286204 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tcfvp"] Feb 27 00:22:16 crc kubenswrapper[4781]: I0227 00:22:16.286447 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tcfvp" podUID="d292517f-9d33-4590-beae-e0810b1395fa" containerName="registry-server" containerID="cri-o://d2674464663697b7cd845ec2becb3f87c0311300123fbaaf0825ff61fcdf9b0a" gracePeriod=2 Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.121766 4781 generic.go:334] "Generic (PLEG): container finished" podID="d292517f-9d33-4590-beae-e0810b1395fa" 
containerID="d2674464663697b7cd845ec2becb3f87c0311300123fbaaf0825ff61fcdf9b0a" exitCode=0 Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.121824 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcfvp" event={"ID":"d292517f-9d33-4590-beae-e0810b1395fa","Type":"ContainerDied","Data":"d2674464663697b7cd845ec2becb3f87c0311300123fbaaf0825ff61fcdf9b0a"} Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.125231 4781 generic.go:334] "Generic (PLEG): container finished" podID="43006307-3a88-4e83-b57f-965df4bd043d" containerID="49c494c22b05665d2a24ac9ebba393c2d522d32048b7482eb856415ae96d68de" exitCode=0 Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.125274 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerDied","Data":"49c494c22b05665d2a24ac9ebba393c2d522d32048b7482eb856415ae96d68de"} Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.232803 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.388572 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvznh\" (UniqueName: \"kubernetes.io/projected/d292517f-9d33-4590-beae-e0810b1395fa-kube-api-access-vvznh\") pod \"d292517f-9d33-4590-beae-e0810b1395fa\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.388707 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-utilities\") pod \"d292517f-9d33-4590-beae-e0810b1395fa\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.388781 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-catalog-content\") pod \"d292517f-9d33-4590-beae-e0810b1395fa\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.389648 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-utilities" (OuterVolumeSpecName: "utilities") pod "d292517f-9d33-4590-beae-e0810b1395fa" (UID: "d292517f-9d33-4590-beae-e0810b1395fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.390866 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.393551 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d292517f-9d33-4590-beae-e0810b1395fa-kube-api-access-vvznh" (OuterVolumeSpecName: "kube-api-access-vvznh") pod "d292517f-9d33-4590-beae-e0810b1395fa" (UID: "d292517f-9d33-4590-beae-e0810b1395fa"). InnerVolumeSpecName "kube-api-access-vvznh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.444485 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d292517f-9d33-4590-beae-e0810b1395fa" (UID: "d292517f-9d33-4590-beae-e0810b1395fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.492580 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.492614 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvznh\" (UniqueName: \"kubernetes.io/projected/d292517f-9d33-4590-beae-e0810b1395fa-kube-api-access-vvznh\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.139650 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcfvp" event={"ID":"d292517f-9d33-4590-beae-e0810b1395fa","Type":"ContainerDied","Data":"dc30d05e6e59ba685bb93b7eed25462156f94fb18da11a6113fc7887bdfc1c16"} Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.139903 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.140041 4781 scope.go:117] "RemoveContainer" containerID="d2674464663697b7cd845ec2becb3f87c0311300123fbaaf0825ff61fcdf9b0a" Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.145021 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerStarted","Data":"1d11fbc8a61aed2b70be8a828981dba857b2c5fbe461635c23f7a2280364a7db"} Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.145075 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerStarted","Data":"7ade9677e2a5a2798e3aec6259047a7e1fbde86fb818094e3dd5ae03401a632e"} Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.145093 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerStarted","Data":"d1e73566326443b55e16e9d978011a7707f5d6e1564e3c3885cba67b8851e546"} Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.145113 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerStarted","Data":"f16864687f756288835edd62a7293c94dff4d226ec5d0cdeeb0921a0f32f390a"} Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.145128 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerStarted","Data":"72be0a56ed19ea154914cc7c6eaec94dceae7b58953d578ffd4f3168750b2fbc"} Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.145143 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" 
event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerStarted","Data":"818f97d6403f9ac22e319204b84e8821cec8a69a3658ea16a5bd1db2eda8b677"} Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.145273 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.168245 4781 scope.go:117] "RemoveContainer" containerID="fbb42f4e19a7208fb759ce636689c17c2b7bc5e5f12afafb5c11f053e74607ac" Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.200709 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-j2n85" podStartSLOduration=5.640312921 podStartE2EDuration="15.200688029s" podCreationTimestamp="2026-02-27 00:22:03 +0000 UTC" firstStartedPulling="2026-02-27 00:22:04.524180947 +0000 UTC m=+993.781720501" lastFinishedPulling="2026-02-27 00:22:14.084556065 +0000 UTC m=+1003.342095609" observedRunningTime="2026-02-27 00:22:18.181814836 +0000 UTC m=+1007.439354420" watchObservedRunningTime="2026-02-27 00:22:18.200688029 +0000 UTC m=+1007.458227593" Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.203353 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tcfvp"] Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.221602 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tcfvp"] Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.227501 4781 scope.go:117] "RemoveContainer" containerID="3f48d77562b78b7c4eb9406b5bae70799989c1e6ce32c4c8dfce12eef51c7679" Feb 27 00:22:19 crc kubenswrapper[4781]: I0227 00:22:19.294760 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:19 crc kubenswrapper[4781]: I0227 00:22:19.319434 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d292517f-9d33-4590-beae-e0810b1395fa" 
path="/var/lib/kubelet/pods/d292517f-9d33-4590-beae-e0810b1395fa/volumes" Feb 27 00:22:19 crc kubenswrapper[4781]: I0227 00:22:19.333684 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:19 crc kubenswrapper[4781]: I0227 00:22:19.418871 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:19 crc kubenswrapper[4781]: I0227 00:22:19.418925 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:19 crc kubenswrapper[4781]: I0227 00:22:19.458845 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:20 crc kubenswrapper[4781]: I0227 00:22:20.212263 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:21 crc kubenswrapper[4781]: I0227 00:22:21.482281 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k995x"] Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.182577 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k995x" podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerName="registry-server" containerID="cri-o://3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98" gracePeriod=2 Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.611485 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.764747 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-catalog-content\") pod \"586212ca-1380-4fea-a2f1-105fc30f56e3\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.764828 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jvpz\" (UniqueName: \"kubernetes.io/projected/586212ca-1380-4fea-a2f1-105fc30f56e3-kube-api-access-4jvpz\") pod \"586212ca-1380-4fea-a2f1-105fc30f56e3\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.764890 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-utilities\") pod \"586212ca-1380-4fea-a2f1-105fc30f56e3\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.766266 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-utilities" (OuterVolumeSpecName: "utilities") pod "586212ca-1380-4fea-a2f1-105fc30f56e3" (UID: "586212ca-1380-4fea-a2f1-105fc30f56e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.789862 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/586212ca-1380-4fea-a2f1-105fc30f56e3-kube-api-access-4jvpz" (OuterVolumeSpecName: "kube-api-access-4jvpz") pod "586212ca-1380-4fea-a2f1-105fc30f56e3" (UID: "586212ca-1380-4fea-a2f1-105fc30f56e3"). InnerVolumeSpecName "kube-api-access-4jvpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.828615 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "586212ca-1380-4fea-a2f1-105fc30f56e3" (UID: "586212ca-1380-4fea-a2f1-105fc30f56e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.866654 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.866691 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jvpz\" (UniqueName: \"kubernetes.io/projected/586212ca-1380-4fea-a2f1-105fc30f56e3-kube-api-access-4jvpz\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.866701 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.193130 4781 generic.go:334] "Generic (PLEG): container finished" podID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerID="3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98" exitCode=0 Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.193178 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k995x" event={"ID":"586212ca-1380-4fea-a2f1-105fc30f56e3","Type":"ContainerDied","Data":"3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98"} Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.193205 4781 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-k995x" event={"ID":"586212ca-1380-4fea-a2f1-105fc30f56e3","Type":"ContainerDied","Data":"08252891894ad181fdad10de247b6b64df19ad9963bf55e86249173eed72e1a9"} Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.193221 4781 scope.go:117] "RemoveContainer" containerID="3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.193402 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.229169 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k995x"] Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.229243 4781 scope.go:117] "RemoveContainer" containerID="bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.234545 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k995x"] Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.247965 4781 scope.go:117] "RemoveContainer" containerID="54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.277350 4781 scope.go:117] "RemoveContainer" containerID="3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98" Feb 27 00:22:23 crc kubenswrapper[4781]: E0227 00:22:23.277756 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98\": container with ID starting with 3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98 not found: ID does not exist" containerID="3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 
00:22:23.277808 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98"} err="failed to get container status \"3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98\": rpc error: code = NotFound desc = could not find container \"3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98\": container with ID starting with 3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98 not found: ID does not exist" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.277830 4781 scope.go:117] "RemoveContainer" containerID="bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6" Feb 27 00:22:23 crc kubenswrapper[4781]: E0227 00:22:23.278084 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6\": container with ID starting with bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6 not found: ID does not exist" containerID="bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.278104 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6"} err="failed to get container status \"bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6\": rpc error: code = NotFound desc = could not find container \"bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6\": container with ID starting with bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6 not found: ID does not exist" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.278137 4781 scope.go:117] "RemoveContainer" containerID="54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed" Feb 27 00:22:23 crc 
kubenswrapper[4781]: E0227 00:22:23.278499 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed\": container with ID starting with 54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed not found: ID does not exist" containerID="54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.278539 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed"} err="failed to get container status \"54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed\": rpc error: code = NotFound desc = could not find container \"54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed\": container with ID starting with 54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed not found: ID does not exist" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.317377 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" path="/var/lib/kubelet/pods/586212ca-1380-4fea-a2f1-105fc30f56e3/volumes" Feb 27 00:22:24 crc kubenswrapper[4781]: I0227 00:22:24.317957 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" Feb 27 00:22:25 crc kubenswrapper[4781]: I0227 00:22:25.903552 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-tljmv" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.091690 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rrx6z"] Feb 27 00:22:32 crc kubenswrapper[4781]: E0227 00:22:32.092187 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerName="registry-server" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.092198 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerName="registry-server" Feb 27 00:22:32 crc kubenswrapper[4781]: E0227 00:22:32.092218 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d292517f-9d33-4590-beae-e0810b1395fa" containerName="extract-content" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.092224 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d292517f-9d33-4590-beae-e0810b1395fa" containerName="extract-content" Feb 27 00:22:32 crc kubenswrapper[4781]: E0227 00:22:32.092236 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerName="extract-utilities" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.092242 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerName="extract-utilities" Feb 27 00:22:32 crc kubenswrapper[4781]: E0227 00:22:32.092252 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerName="extract-content" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.092257 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerName="extract-content" Feb 27 00:22:32 crc kubenswrapper[4781]: E0227 00:22:32.092268 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d292517f-9d33-4590-beae-e0810b1395fa" containerName="registry-server" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.092273 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d292517f-9d33-4590-beae-e0810b1395fa" containerName="registry-server" Feb 27 00:22:32 crc kubenswrapper[4781]: E0227 00:22:32.092283 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d292517f-9d33-4590-beae-e0810b1395fa" containerName="extract-utilities" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.092290 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d292517f-9d33-4590-beae-e0810b1395fa" containerName="extract-utilities" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.092396 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d292517f-9d33-4590-beae-e0810b1395fa" containerName="registry-server" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.092404 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerName="registry-server" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.093414 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.095838 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-s7w9m" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.095981 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.096264 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.109558 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rrx6z"] Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.193275 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6hkr\" (UniqueName: \"kubernetes.io/projected/f66c974d-5687-42bd-9742-469922240fd5-kube-api-access-q6hkr\") pod \"openstack-operator-index-rrx6z\" (UID: \"f66c974d-5687-42bd-9742-469922240fd5\") " 
pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.295061 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6hkr\" (UniqueName: \"kubernetes.io/projected/f66c974d-5687-42bd-9742-469922240fd5-kube-api-access-q6hkr\") pod \"openstack-operator-index-rrx6z\" (UID: \"f66c974d-5687-42bd-9742-469922240fd5\") " pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.313784 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6hkr\" (UniqueName: \"kubernetes.io/projected/f66c974d-5687-42bd-9742-469922240fd5-kube-api-access-q6hkr\") pod \"openstack-operator-index-rrx6z\" (UID: \"f66c974d-5687-42bd-9742-469922240fd5\") " pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.424338 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.817377 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rrx6z"] Feb 27 00:22:33 crc kubenswrapper[4781]: I0227 00:22:33.268383 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rrx6z" event={"ID":"f66c974d-5687-42bd-9742-469922240fd5","Type":"ContainerStarted","Data":"233e724f10dca6bc8119540049620e5e753edba61bd421691e91b4d2fb68526b"} Feb 27 00:22:34 crc kubenswrapper[4781]: I0227 00:22:34.303599 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:35 crc kubenswrapper[4781]: I0227 00:22:35.285253 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rrx6z" 
event={"ID":"f66c974d-5687-42bd-9742-469922240fd5","Type":"ContainerStarted","Data":"6d247caa220fa7adc12fb8d3b113153200f3ad9a6c3899aa94ab78b37af649ff"} Feb 27 00:22:35 crc kubenswrapper[4781]: I0227 00:22:35.311002 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rrx6z" podStartSLOduration=1.082876801 podStartE2EDuration="3.310975763s" podCreationTimestamp="2026-02-27 00:22:32 +0000 UTC" firstStartedPulling="2026-02-27 00:22:32.829242593 +0000 UTC m=+1022.086782147" lastFinishedPulling="2026-02-27 00:22:35.057341555 +0000 UTC m=+1024.314881109" observedRunningTime="2026-02-27 00:22:35.302604834 +0000 UTC m=+1024.560144418" watchObservedRunningTime="2026-02-27 00:22:35.310975763 +0000 UTC m=+1024.568515347" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.695067 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpbl"] Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.702370 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.709048 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpbl"] Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.794504 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9p6p\" (UniqueName: \"kubernetes.io/projected/2595ff69-b633-443a-81ca-238982513cf4-kube-api-access-d9p6p\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.794566 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-catalog-content\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.794675 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-utilities\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.895939 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9p6p\" (UniqueName: \"kubernetes.io/projected/2595ff69-b633-443a-81ca-238982513cf4-kube-api-access-d9p6p\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.895995 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-catalog-content\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.896054 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-utilities\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.896532 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-utilities\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.896553 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-catalog-content\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.917684 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9p6p\" (UniqueName: \"kubernetes.io/projected/2595ff69-b633-443a-81ca-238982513cf4-kube-api-access-d9p6p\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:38 crc kubenswrapper[4781]: I0227 00:22:38.020427 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:38 crc kubenswrapper[4781]: I0227 00:22:38.531199 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpbl"] Feb 27 00:22:38 crc kubenswrapper[4781]: W0227 00:22:38.542606 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2595ff69_b633_443a_81ca_238982513cf4.slice/crio-d510acce79081bd256d287a8b249b7c90204000cf94cd34fe8e254caa90ba14b WatchSource:0}: Error finding container d510acce79081bd256d287a8b249b7c90204000cf94cd34fe8e254caa90ba14b: Status 404 returned error can't find the container with id d510acce79081bd256d287a8b249b7c90204000cf94cd34fe8e254caa90ba14b Feb 27 00:22:39 crc kubenswrapper[4781]: I0227 00:22:39.312052 4781 generic.go:334] "Generic (PLEG): container finished" podID="2595ff69-b633-443a-81ca-238982513cf4" containerID="72146b21e6a598f0d19679595e182fcde5a67855f58589e196415993f2a3b7f4" exitCode=0 Feb 27 00:22:39 crc kubenswrapper[4781]: I0227 00:22:39.316323 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpbl" event={"ID":"2595ff69-b633-443a-81ca-238982513cf4","Type":"ContainerDied","Data":"72146b21e6a598f0d19679595e182fcde5a67855f58589e196415993f2a3b7f4"} Feb 27 00:22:39 crc kubenswrapper[4781]: I0227 00:22:39.316353 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpbl" event={"ID":"2595ff69-b633-443a-81ca-238982513cf4","Type":"ContainerStarted","Data":"d510acce79081bd256d287a8b249b7c90204000cf94cd34fe8e254caa90ba14b"} Feb 27 00:22:40 crc kubenswrapper[4781]: I0227 00:22:40.320559 4781 generic.go:334] "Generic (PLEG): container finished" podID="2595ff69-b633-443a-81ca-238982513cf4" containerID="3c9631be4e58fde1db46b99f19a1e6807b18aacae49000c04cc5603d62ba18fa" exitCode=0 Feb 27 00:22:40 crc kubenswrapper[4781]: I0227 
00:22:40.320602 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpbl" event={"ID":"2595ff69-b633-443a-81ca-238982513cf4","Type":"ContainerDied","Data":"3c9631be4e58fde1db46b99f19a1e6807b18aacae49000c04cc5603d62ba18fa"} Feb 27 00:22:41 crc kubenswrapper[4781]: I0227 00:22:41.328702 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpbl" event={"ID":"2595ff69-b633-443a-81ca-238982513cf4","Type":"ContainerStarted","Data":"4e7d81148b9a631f02e9b59a7880d37ff51e7b7fc6595218d4abbd94b9c5c429"} Feb 27 00:22:41 crc kubenswrapper[4781]: I0227 00:22:41.342097 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xhpbl" podStartSLOduration=2.8702360000000002 podStartE2EDuration="4.342079107s" podCreationTimestamp="2026-02-27 00:22:37 +0000 UTC" firstStartedPulling="2026-02-27 00:22:39.314166327 +0000 UTC m=+1028.571705891" lastFinishedPulling="2026-02-27 00:22:40.786009444 +0000 UTC m=+1030.043548998" observedRunningTime="2026-02-27 00:22:41.342053436 +0000 UTC m=+1030.599592990" watchObservedRunningTime="2026-02-27 00:22:41.342079107 +0000 UTC m=+1030.599618651" Feb 27 00:22:42 crc kubenswrapper[4781]: I0227 00:22:42.425181 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:42 crc kubenswrapper[4781]: I0227 00:22:42.425233 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:42 crc kubenswrapper[4781]: I0227 00:22:42.451490 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:43 crc kubenswrapper[4781]: I0227 00:22:43.369349 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.545386 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b"] Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.547951 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.557612 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-xcdg7" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.557714 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b"] Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.696817 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sswq6\" (UniqueName: \"kubernetes.io/projected/343b5811-baf3-443e-a8fe-074f7b844d14-kube-api-access-sswq6\") pod \"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.697043 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-util\") pod \"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.697329 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-bundle\") pod \"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.799357 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-bundle\") pod \"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.799463 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sswq6\" (UniqueName: \"kubernetes.io/projected/343b5811-baf3-443e-a8fe-074f7b844d14-kube-api-access-sswq6\") pod \"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.799526 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-util\") pod \"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.799897 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-bundle\") pod 
\"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.799989 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-util\") pod \"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.822891 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sswq6\" (UniqueName: \"kubernetes.io/projected/343b5811-baf3-443e-a8fe-074f7b844d14-kube-api-access-sswq6\") pod \"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.870298 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:46 crc kubenswrapper[4781]: I0227 00:22:46.342675 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b"] Feb 27 00:22:46 crc kubenswrapper[4781]: I0227 00:22:46.365348 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" event={"ID":"343b5811-baf3-443e-a8fe-074f7b844d14","Type":"ContainerStarted","Data":"b5cc513bb3039586b4f615fb8a649c5385cbc55e7cadc346698a4db8fb0b0af7"} Feb 27 00:22:47 crc kubenswrapper[4781]: I0227 00:22:47.372115 4781 generic.go:334] "Generic (PLEG): container finished" podID="343b5811-baf3-443e-a8fe-074f7b844d14" containerID="b25efba51952e564d3e981d032e4b63e9e54af39849e65b027cfac79467364c7" exitCode=0 Feb 27 00:22:47 crc kubenswrapper[4781]: I0227 00:22:47.372370 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" event={"ID":"343b5811-baf3-443e-a8fe-074f7b844d14","Type":"ContainerDied","Data":"b25efba51952e564d3e981d032e4b63e9e54af39849e65b027cfac79467364c7"} Feb 27 00:22:48 crc kubenswrapper[4781]: I0227 00:22:48.020566 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:48 crc kubenswrapper[4781]: I0227 00:22:48.020676 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:48 crc kubenswrapper[4781]: I0227 00:22:48.058080 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:48 crc kubenswrapper[4781]: I0227 00:22:48.382860 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="343b5811-baf3-443e-a8fe-074f7b844d14" containerID="20931fb0e24304c280dec3a74e5cb7c1a581d67c8b4c7fcf99bc0128ac8d0526" exitCode=0 Feb 27 00:22:48 crc kubenswrapper[4781]: I0227 00:22:48.384834 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" event={"ID":"343b5811-baf3-443e-a8fe-074f7b844d14","Type":"ContainerDied","Data":"20931fb0e24304c280dec3a74e5cb7c1a581d67c8b4c7fcf99bc0128ac8d0526"} Feb 27 00:22:48 crc kubenswrapper[4781]: I0227 00:22:48.432740 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:49 crc kubenswrapper[4781]: I0227 00:22:49.067112 4781 scope.go:117] "RemoveContainer" containerID="96bd641ff5c28b0d487d9f55a81f55a83bc758e496b0e0a0d2639cc8d0b260d5" Feb 27 00:22:49 crc kubenswrapper[4781]: I0227 00:22:49.391003 4781 generic.go:334] "Generic (PLEG): container finished" podID="343b5811-baf3-443e-a8fe-074f7b844d14" containerID="59a035570e9bebe7c38eff7b66207812fc41e675067fc11daaac1296a49c6bb2" exitCode=0 Feb 27 00:22:49 crc kubenswrapper[4781]: I0227 00:22:49.391130 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" event={"ID":"343b5811-baf3-443e-a8fe-074f7b844d14","Type":"ContainerDied","Data":"59a035570e9bebe7c38eff7b66207812fc41e675067fc11daaac1296a49c6bb2"} Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.711455 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.867799 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sswq6\" (UniqueName: \"kubernetes.io/projected/343b5811-baf3-443e-a8fe-074f7b844d14-kube-api-access-sswq6\") pod \"343b5811-baf3-443e-a8fe-074f7b844d14\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.867882 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-util\") pod \"343b5811-baf3-443e-a8fe-074f7b844d14\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.867933 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-bundle\") pod \"343b5811-baf3-443e-a8fe-074f7b844d14\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.869037 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-bundle" (OuterVolumeSpecName: "bundle") pod "343b5811-baf3-443e-a8fe-074f7b844d14" (UID: "343b5811-baf3-443e-a8fe-074f7b844d14"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.875247 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343b5811-baf3-443e-a8fe-074f7b844d14-kube-api-access-sswq6" (OuterVolumeSpecName: "kube-api-access-sswq6") pod "343b5811-baf3-443e-a8fe-074f7b844d14" (UID: "343b5811-baf3-443e-a8fe-074f7b844d14"). InnerVolumeSpecName "kube-api-access-sswq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.881096 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-util" (OuterVolumeSpecName: "util") pod "343b5811-baf3-443e-a8fe-074f7b844d14" (UID: "343b5811-baf3-443e-a8fe-074f7b844d14"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.969877 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sswq6\" (UniqueName: \"kubernetes.io/projected/343b5811-baf3-443e-a8fe-074f7b844d14-kube-api-access-sswq6\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.970385 4781 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-util\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.970582 4781 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:51 crc kubenswrapper[4781]: I0227 00:22:51.408105 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" event={"ID":"343b5811-baf3-443e-a8fe-074f7b844d14","Type":"ContainerDied","Data":"b5cc513bb3039586b4f615fb8a649c5385cbc55e7cadc346698a4db8fb0b0af7"} Feb 27 00:22:51 crc kubenswrapper[4781]: I0227 00:22:51.408136 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:51 crc kubenswrapper[4781]: I0227 00:22:51.408151 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5cc513bb3039586b4f615fb8a649c5385cbc55e7cadc346698a4db8fb0b0af7" Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.085292 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpbl"] Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.085753 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xhpbl" podUID="2595ff69-b633-443a-81ca-238982513cf4" containerName="registry-server" containerID="cri-o://4e7d81148b9a631f02e9b59a7880d37ff51e7b7fc6595218d4abbd94b9c5c429" gracePeriod=2 Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.420744 4781 generic.go:334] "Generic (PLEG): container finished" podID="2595ff69-b633-443a-81ca-238982513cf4" containerID="4e7d81148b9a631f02e9b59a7880d37ff51e7b7fc6595218d4abbd94b9c5c429" exitCode=0 Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.421021 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpbl" event={"ID":"2595ff69-b633-443a-81ca-238982513cf4","Type":"ContainerDied","Data":"4e7d81148b9a631f02e9b59a7880d37ff51e7b7fc6595218d4abbd94b9c5c429"} Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.526686 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.623586 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9p6p\" (UniqueName: \"kubernetes.io/projected/2595ff69-b633-443a-81ca-238982513cf4-kube-api-access-d9p6p\") pod \"2595ff69-b633-443a-81ca-238982513cf4\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.623658 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-catalog-content\") pod \"2595ff69-b633-443a-81ca-238982513cf4\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.623705 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-utilities\") pod \"2595ff69-b633-443a-81ca-238982513cf4\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.624818 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-utilities" (OuterVolumeSpecName: "utilities") pod "2595ff69-b633-443a-81ca-238982513cf4" (UID: "2595ff69-b633-443a-81ca-238982513cf4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.632846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2595ff69-b633-443a-81ca-238982513cf4-kube-api-access-d9p6p" (OuterVolumeSpecName: "kube-api-access-d9p6p") pod "2595ff69-b633-443a-81ca-238982513cf4" (UID: "2595ff69-b633-443a-81ca-238982513cf4"). InnerVolumeSpecName "kube-api-access-d9p6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.659107 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2595ff69-b633-443a-81ca-238982513cf4" (UID: "2595ff69-b633-443a-81ca-238982513cf4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.724702 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.724740 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9p6p\" (UniqueName: \"kubernetes.io/projected/2595ff69-b633-443a-81ca-238982513cf4-kube-api-access-d9p6p\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.724749 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:53 crc kubenswrapper[4781]: I0227 00:22:53.429703 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpbl" event={"ID":"2595ff69-b633-443a-81ca-238982513cf4","Type":"ContainerDied","Data":"d510acce79081bd256d287a8b249b7c90204000cf94cd34fe8e254caa90ba14b"} Feb 27 00:22:53 crc kubenswrapper[4781]: I0227 00:22:53.429744 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:53 crc kubenswrapper[4781]: I0227 00:22:53.430078 4781 scope.go:117] "RemoveContainer" containerID="4e7d81148b9a631f02e9b59a7880d37ff51e7b7fc6595218d4abbd94b9c5c429" Feb 27 00:22:53 crc kubenswrapper[4781]: I0227 00:22:53.468233 4781 scope.go:117] "RemoveContainer" containerID="3c9631be4e58fde1db46b99f19a1e6807b18aacae49000c04cc5603d62ba18fa" Feb 27 00:22:53 crc kubenswrapper[4781]: I0227 00:22:53.479789 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpbl"] Feb 27 00:22:53 crc kubenswrapper[4781]: I0227 00:22:53.487863 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpbl"] Feb 27 00:22:53 crc kubenswrapper[4781]: I0227 00:22:53.497821 4781 scope.go:117] "RemoveContainer" containerID="72146b21e6a598f0d19679595e182fcde5a67855f58589e196415993f2a3b7f4" Feb 27 00:22:55 crc kubenswrapper[4781]: I0227 00:22:55.318108 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2595ff69-b633-443a-81ca-238982513cf4" path="/var/lib/kubelet/pods/2595ff69-b633-443a-81ca-238982513cf4/volumes" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.888827 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb"] Feb 27 00:22:56 crc kubenswrapper[4781]: E0227 00:22:56.889561 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2595ff69-b633-443a-81ca-238982513cf4" containerName="registry-server" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.889589 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2595ff69-b633-443a-81ca-238982513cf4" containerName="registry-server" Feb 27 00:22:56 crc kubenswrapper[4781]: E0227 00:22:56.889615 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2595ff69-b633-443a-81ca-238982513cf4" 
containerName="extract-utilities" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.889658 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2595ff69-b633-443a-81ca-238982513cf4" containerName="extract-utilities" Feb 27 00:22:56 crc kubenswrapper[4781]: E0227 00:22:56.889692 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343b5811-baf3-443e-a8fe-074f7b844d14" containerName="extract" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.889705 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="343b5811-baf3-443e-a8fe-074f7b844d14" containerName="extract" Feb 27 00:22:56 crc kubenswrapper[4781]: E0227 00:22:56.889729 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343b5811-baf3-443e-a8fe-074f7b844d14" containerName="pull" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.889741 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="343b5811-baf3-443e-a8fe-074f7b844d14" containerName="pull" Feb 27 00:22:56 crc kubenswrapper[4781]: E0227 00:22:56.889756 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2595ff69-b633-443a-81ca-238982513cf4" containerName="extract-content" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.889770 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2595ff69-b633-443a-81ca-238982513cf4" containerName="extract-content" Feb 27 00:22:56 crc kubenswrapper[4781]: E0227 00:22:56.889790 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343b5811-baf3-443e-a8fe-074f7b844d14" containerName="util" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.889802 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="343b5811-baf3-443e-a8fe-074f7b844d14" containerName="util" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.890011 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="343b5811-baf3-443e-a8fe-074f7b844d14" containerName="extract" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 
00:22:56.890035 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2595ff69-b633-443a-81ca-238982513cf4" containerName="registry-server" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.890791 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.893723 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-49nbd" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.958052 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb"] Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.982527 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvzs9\" (UniqueName: \"kubernetes.io/projected/837579c4-87be-4ce8-94ff-bf25307562db-kube-api-access-hvzs9\") pod \"openstack-operator-controller-init-85cf9d4d7d-cl7rb\" (UID: \"837579c4-87be-4ce8-94ff-bf25307562db\") " pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" Feb 27 00:22:57 crc kubenswrapper[4781]: I0227 00:22:57.083787 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvzs9\" (UniqueName: \"kubernetes.io/projected/837579c4-87be-4ce8-94ff-bf25307562db-kube-api-access-hvzs9\") pod \"openstack-operator-controller-init-85cf9d4d7d-cl7rb\" (UID: \"837579c4-87be-4ce8-94ff-bf25307562db\") " pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" Feb 27 00:22:57 crc kubenswrapper[4781]: I0227 00:22:57.101573 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvzs9\" (UniqueName: \"kubernetes.io/projected/837579c4-87be-4ce8-94ff-bf25307562db-kube-api-access-hvzs9\") pod 
\"openstack-operator-controller-init-85cf9d4d7d-cl7rb\" (UID: \"837579c4-87be-4ce8-94ff-bf25307562db\") " pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" Feb 27 00:22:57 crc kubenswrapper[4781]: I0227 00:22:57.213910 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" Feb 27 00:22:57 crc kubenswrapper[4781]: I0227 00:22:57.772922 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb"] Feb 27 00:22:58 crc kubenswrapper[4781]: I0227 00:22:58.461838 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" event={"ID":"837579c4-87be-4ce8-94ff-bf25307562db","Type":"ContainerStarted","Data":"cc33ba6650701b3cee9761acdb9a2ccc4cfb594b7a51d01c95ed5ecef8fd1322"} Feb 27 00:23:02 crc kubenswrapper[4781]: I0227 00:23:02.493956 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" event={"ID":"837579c4-87be-4ce8-94ff-bf25307562db","Type":"ContainerStarted","Data":"96a3c846f670e5eb6b260f2637021742a1d1d083d4c71b4991afdcf3a74c14ee"} Feb 27 00:23:02 crc kubenswrapper[4781]: I0227 00:23:02.494431 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" Feb 27 00:23:02 crc kubenswrapper[4781]: I0227 00:23:02.526708 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" podStartSLOduration=2.69838488 podStartE2EDuration="6.526688472s" podCreationTimestamp="2026-02-27 00:22:56 +0000 UTC" firstStartedPulling="2026-02-27 00:22:57.785139702 +0000 UTC m=+1047.042679256" lastFinishedPulling="2026-02-27 00:23:01.613443284 +0000 UTC m=+1050.870982848" 
observedRunningTime="2026-02-27 00:23:02.522281777 +0000 UTC m=+1051.779821331" watchObservedRunningTime="2026-02-27 00:23:02.526688472 +0000 UTC m=+1051.784228026" Feb 27 00:23:07 crc kubenswrapper[4781]: I0227 00:23:07.217567 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" Feb 27 00:23:12 crc kubenswrapper[4781]: I0227 00:23:12.895369 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:23:12 crc kubenswrapper[4781]: I0227 00:23:12.895952 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.721795 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.723104 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.725478 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-dtmpw" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.729820 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.730649 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.738096 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-nqnlf" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.740404 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.741257 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.744400 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-p57jl" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.749608 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.753593 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.776580 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.793605 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.794683 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.796156 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-2gkks" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.802141 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.802974 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.808754 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-4mv5w" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.821303 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.822148 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.822747 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlngn\" (UniqueName: \"kubernetes.io/projected/fe1f6a92-751f-417e-b2ff-694c10210db7-kube-api-access-jlngn\") pod \"barbican-operator-controller-manager-868647ff47-rfwpm\" (UID: \"fe1f6a92-751f-417e-b2ff-694c10210db7\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.822787 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4jfx\" (UniqueName: \"kubernetes.io/projected/bd77d7fe-85fb-4b16-aa12-75359b52e139-kube-api-access-p4jfx\") pod \"designate-operator-controller-manager-6d8bf5c495-rn44b\" (UID: \"bd77d7fe-85fb-4b16-aa12-75359b52e139\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.822817 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79tf8\" (UniqueName: \"kubernetes.io/projected/e4d59c4e-1fd2-43d9-8ac2-d162e746e758-kube-api-access-79tf8\") pod 
\"cinder-operator-controller-manager-55d77d7b5c-fb2wf\" (UID: \"e4d59c4e-1fd2-43d9-8ac2-d162e746e758\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.830733 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.831589 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.841563 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.843894 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.844166 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-xzphw" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.844297 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-dlhw4" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.849510 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.856589 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.857816 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.864480 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-whkbx" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.901694 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925500 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925574 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww586\" (UniqueName: \"kubernetes.io/projected/c1807c06-6c68-477c-8725-5702e2d59c93-kube-api-access-ww586\") pod \"horizon-operator-controller-manager-5b9b8895d5-fmbwz\" (UID: \"c1807c06-6c68-477c-8725-5702e2d59c93\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925602 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrds7\" (UniqueName: \"kubernetes.io/projected/6739bbb3-bf62-4b1d-8dd7-3accde691e66-kube-api-access-mrds7\") pod \"heat-operator-controller-manager-69f49c598c-nfzvw\" (UID: \"6739bbb3-bf62-4b1d-8dd7-3accde691e66\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925656 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpr5p\" (UniqueName: \"kubernetes.io/projected/771a50fd-33f6-47ba-ac4a-46da5446cdd8-kube-api-access-vpr5p\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925712 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlngn\" (UniqueName: \"kubernetes.io/projected/fe1f6a92-751f-417e-b2ff-694c10210db7-kube-api-access-jlngn\") pod \"barbican-operator-controller-manager-868647ff47-rfwpm\" (UID: \"fe1f6a92-751f-417e-b2ff-694c10210db7\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925738 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnj2c\" (UniqueName: \"kubernetes.io/projected/66c995b3-f763-455e-8ea3-7dfdfb4c4301-kube-api-access-rnj2c\") pod \"glance-operator-controller-manager-784b5bb6c5-4gl88\" (UID: \"66c995b3-f763-455e-8ea3-7dfdfb4c4301\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925785 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4jfx\" (UniqueName: \"kubernetes.io/projected/bd77d7fe-85fb-4b16-aa12-75359b52e139-kube-api-access-p4jfx\") pod \"designate-operator-controller-manager-6d8bf5c495-rn44b\" (UID: \"bd77d7fe-85fb-4b16-aa12-75359b52e139\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925815 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79tf8\" (UniqueName: 
\"kubernetes.io/projected/e4d59c4e-1fd2-43d9-8ac2-d162e746e758-kube-api-access-79tf8\") pod \"cinder-operator-controller-manager-55d77d7b5c-fb2wf\" (UID: \"e4d59c4e-1fd2-43d9-8ac2-d162e746e758\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925891 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj2ws\" (UniqueName: \"kubernetes.io/projected/513da4ed-be63-45dd-a32a-27ac3ef443a5-kube-api-access-sj2ws\") pod \"ironic-operator-controller-manager-554564d7fc-szs2w\" (UID: \"513da4ed-be63-45dd-a32a-27ac3ef443a5\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.006093 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79tf8\" (UniqueName: \"kubernetes.io/projected/e4d59c4e-1fd2-43d9-8ac2-d162e746e758-kube-api-access-79tf8\") pod \"cinder-operator-controller-manager-55d77d7b5c-fb2wf\" (UID: \"e4d59c4e-1fd2-43d9-8ac2-d162e746e758\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.007996 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4jfx\" (UniqueName: \"kubernetes.io/projected/bd77d7fe-85fb-4b16-aa12-75359b52e139-kube-api-access-p4jfx\") pod \"designate-operator-controller-manager-6d8bf5c495-rn44b\" (UID: \"bd77d7fe-85fb-4b16-aa12-75359b52e139\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.008935 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlngn\" (UniqueName: \"kubernetes.io/projected/fe1f6a92-751f-417e-b2ff-694c10210db7-kube-api-access-jlngn\") pod \"barbican-operator-controller-manager-868647ff47-rfwpm\" (UID: 
\"fe1f6a92-751f-417e-b2ff-694c10210db7\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.012816 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.017002 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.026677 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.027583 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.028785 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpr5p\" (UniqueName: \"kubernetes.io/projected/771a50fd-33f6-47ba-ac4a-46da5446cdd8-kube-api-access-vpr5p\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.028818 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnj2c\" (UniqueName: \"kubernetes.io/projected/66c995b3-f763-455e-8ea3-7dfdfb4c4301-kube-api-access-rnj2c\") pod \"glance-operator-controller-manager-784b5bb6c5-4gl88\" (UID: \"66c995b3-f763-455e-8ea3-7dfdfb4c4301\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.028883 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sj2ws\" (UniqueName: \"kubernetes.io/projected/513da4ed-be63-45dd-a32a-27ac3ef443a5-kube-api-access-sj2ws\") pod \"ironic-operator-controller-manager-554564d7fc-szs2w\" (UID: \"513da4ed-be63-45dd-a32a-27ac3ef443a5\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.028907 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.028927 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww586\" (UniqueName: \"kubernetes.io/projected/c1807c06-6c68-477c-8725-5702e2d59c93-kube-api-access-ww586\") pod \"horizon-operator-controller-manager-5b9b8895d5-fmbwz\" (UID: \"c1807c06-6c68-477c-8725-5702e2d59c93\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.028948 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrds7\" (UniqueName: \"kubernetes.io/projected/6739bbb3-bf62-4b1d-8dd7-3accde691e66-kube-api-access-mrds7\") pod \"heat-operator-controller-manager-69f49c598c-nfzvw\" (UID: \"6739bbb3-bf62-4b1d-8dd7-3accde691e66\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.029467 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.029513 4781 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert podName:771a50fd-33f6-47ba-ac4a-46da5446cdd8 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:28.529496915 +0000 UTC m=+1077.787036469 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert") pod "infra-operator-controller-manager-79d975b745-vhmbb" (UID: "771a50fd-33f6-47ba-ac4a-46da5446cdd8") : secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.033573 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-lldlj" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.045605 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.059301 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.074919 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.079362 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.106588 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpr5p\" (UniqueName: \"kubernetes.io/projected/771a50fd-33f6-47ba-ac4a-46da5446cdd8-kube-api-access-vpr5p\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.115291 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrds7\" (UniqueName: \"kubernetes.io/projected/6739bbb3-bf62-4b1d-8dd7-3accde691e66-kube-api-access-mrds7\") pod \"heat-operator-controller-manager-69f49c598c-nfzvw\" (UID: \"6739bbb3-bf62-4b1d-8dd7-3accde691e66\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.115330 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww586\" (UniqueName: \"kubernetes.io/projected/c1807c06-6c68-477c-8725-5702e2d59c93-kube-api-access-ww586\") pod \"horizon-operator-controller-manager-5b9b8895d5-fmbwz\" (UID: \"c1807c06-6c68-477c-8725-5702e2d59c93\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.115823 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj2ws\" (UniqueName: \"kubernetes.io/projected/513da4ed-be63-45dd-a32a-27ac3ef443a5-kube-api-access-sj2ws\") pod \"ironic-operator-controller-manager-554564d7fc-szs2w\" (UID: 
\"513da4ed-be63-45dd-a32a-27ac3ef443a5\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.115853 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.116559 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnj2c\" (UniqueName: \"kubernetes.io/projected/66c995b3-f763-455e-8ea3-7dfdfb4c4301-kube-api-access-rnj2c\") pod \"glance-operator-controller-manager-784b5bb6c5-4gl88\" (UID: \"66c995b3-f763-455e-8ea3-7dfdfb4c4301\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.116598 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.122050 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.131468 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqq4t\" (UniqueName: \"kubernetes.io/projected/057d4c8d-606e-44ea-89ea-fb17b4d63733-kube-api-access-dqq4t\") pod \"keystone-operator-controller-manager-b4d948c87-2pgf6\" (UID: \"057d4c8d-606e-44ea-89ea-fb17b4d63733\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.135777 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.136569 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.136891 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-tzjtp" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.139505 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-94jbq" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.152362 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.153954 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.183138 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.184020 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.197979 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-2rnp9" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.199748 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.213120 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.220678 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.221564 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.229995 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-p8m82" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.234050 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqq4t\" (UniqueName: \"kubernetes.io/projected/057d4c8d-606e-44ea-89ea-fb17b4d63733-kube-api-access-dqq4t\") pod \"keystone-operator-controller-manager-b4d948c87-2pgf6\" (UID: \"057d4c8d-606e-44ea-89ea-fb17b4d63733\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.234092 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffzrf\" (UniqueName: \"kubernetes.io/projected/a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad-kube-api-access-ffzrf\") pod \"manila-operator-controller-manager-67d996989d-jnhdb\" (UID: \"a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.234128 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp4z9\" (UniqueName: \"kubernetes.io/projected/f777df4b-1040-4f86-a816-ea778b9e5dc3-kube-api-access-fp4z9\") pod 
\"mariadb-operator-controller-manager-6994f66f48-w5wp5\" (UID: \"f777df4b-1040-4f86-a816-ea778b9e5dc3\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.242844 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.272691 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.273571 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.284706 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.299688 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqq4t\" (UniqueName: \"kubernetes.io/projected/057d4c8d-606e-44ea-89ea-fb17b4d63733-kube-api-access-dqq4t\") pod \"keystone-operator-controller-manager-b4d948c87-2pgf6\" (UID: \"057d4c8d-606e-44ea-89ea-fb17b4d63733\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.300196 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-ml5q6" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.335649 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp4z9\" (UniqueName: \"kubernetes.io/projected/f777df4b-1040-4f86-a816-ea778b9e5dc3-kube-api-access-fp4z9\") pod \"mariadb-operator-controller-manager-6994f66f48-w5wp5\" (UID: 
\"f777df4b-1040-4f86-a816-ea778b9e5dc3\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.335737 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxc2d\" (UniqueName: \"kubernetes.io/projected/fe25346c-5f31-478e-a639-060c5958b1eb-kube-api-access-vxc2d\") pod \"neutron-operator-controller-manager-6bd4687957-v5hwb\" (UID: \"fe25346c-5f31-478e-a639-060c5958b1eb\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.335765 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l84m\" (UniqueName: \"kubernetes.io/projected/7d5e1e13-5ce4-48ba-a8c9-3db924e63840-kube-api-access-6l84m\") pod \"octavia-operator-controller-manager-659dc6bbfc-tb298\" (UID: \"7d5e1e13-5ce4-48ba-a8c9-3db924e63840\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.335793 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78p8s\" (UniqueName: \"kubernetes.io/projected/e9a3b900-688c-4043-b1ff-53ae1c3ee1d6-kube-api-access-78p8s\") pod \"nova-operator-controller-manager-567668f5cf-trb7t\" (UID: \"e9a3b900-688c-4043-b1ff-53ae1c3ee1d6\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.335828 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffzrf\" (UniqueName: \"kubernetes.io/projected/a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad-kube-api-access-ffzrf\") pod \"manila-operator-controller-manager-67d996989d-jnhdb\" (UID: \"a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad\") " 
pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.339879 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.356131 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.361667 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-blcnh" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.361785 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.363061 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp4z9\" (UniqueName: \"kubernetes.io/projected/f777df4b-1040-4f86-a816-ea778b9e5dc3-kube-api-access-fp4z9\") pod \"mariadb-operator-controller-manager-6994f66f48-w5wp5\" (UID: \"f777df4b-1040-4f86-a816-ea778b9e5dc3\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.366647 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffzrf\" (UniqueName: \"kubernetes.io/projected/a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad-kube-api-access-ffzrf\") pod \"manila-operator-controller-manager-67d996989d-jnhdb\" (UID: \"a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.370663 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.375613 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.407779 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.408982 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.436745 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.437944 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.438305 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb9s6\" (UniqueName: \"kubernetes.io/projected/83466be2-d230-4516-b594-ee56aae3c510-kube-api-access-qb9s6\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.450925 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-tc89m" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.452427 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxc2d\" (UniqueName: \"kubernetes.io/projected/fe25346c-5f31-478e-a639-060c5958b1eb-kube-api-access-vxc2d\") pod \"neutron-operator-controller-manager-6bd4687957-v5hwb\" (UID: \"fe25346c-5f31-478e-a639-060c5958b1eb\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.452492 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.452511 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l84m\" (UniqueName: 
\"kubernetes.io/projected/7d5e1e13-5ce4-48ba-a8c9-3db924e63840-kube-api-access-6l84m\") pod \"octavia-operator-controller-manager-659dc6bbfc-tb298\" (UID: \"7d5e1e13-5ce4-48ba-a8c9-3db924e63840\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.452567 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78p8s\" (UniqueName: \"kubernetes.io/projected/e9a3b900-688c-4043-b1ff-53ae1c3ee1d6-kube-api-access-78p8s\") pod \"nova-operator-controller-manager-567668f5cf-trb7t\" (UID: \"e9a3b900-688c-4043-b1ff-53ae1c3ee1d6\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.461811 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.462702 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.468267 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-bv9s7" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.468410 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.482059 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78p8s\" (UniqueName: \"kubernetes.io/projected/e9a3b900-688c-4043-b1ff-53ae1c3ee1d6-kube-api-access-78p8s\") pod \"nova-operator-controller-manager-567668f5cf-trb7t\" (UID: \"e9a3b900-688c-4043-b1ff-53ae1c3ee1d6\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.484246 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.485234 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.487588 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxc2d\" (UniqueName: \"kubernetes.io/projected/fe25346c-5f31-478e-a639-060c5958b1eb-kube-api-access-vxc2d\") pod \"neutron-operator-controller-manager-6bd4687957-v5hwb\" (UID: \"fe25346c-5f31-478e-a639-060c5958b1eb\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.487859 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-zrhpq" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.489902 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l84m\" (UniqueName: \"kubernetes.io/projected/7d5e1e13-5ce4-48ba-a8c9-3db924e63840-kube-api-access-6l84m\") pod \"octavia-operator-controller-manager-659dc6bbfc-tb298\" (UID: \"7d5e1e13-5ce4-48ba-a8c9-3db924e63840\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.495597 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.499797 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.523896 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.534977 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.535841 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.542933 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.544028 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-6jtdz" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.556369 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zrxh\" (UniqueName: \"kubernetes.io/projected/3747ddf8-799c-441c-bd9d-4450bdb72382-kube-api-access-6zrxh\") pod \"swift-operator-controller-manager-68f46476f-5mgl8\" (UID: \"3747ddf8-799c-441c-bd9d-4450bdb72382\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.556458 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.556515 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rvr4\" (UniqueName: \"kubernetes.io/projected/9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1-kube-api-access-4rvr4\") pod \"placement-operator-controller-manager-8497b45c89-rn2vt\" (UID: \"9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.556567 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.556594 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c4x8\" (UniqueName: \"kubernetes.io/projected/fae0f5f8-e721-4ef1-9c8f-4574f156913f-kube-api-access-7c4x8\") pod \"ovn-operator-controller-manager-5955d8c787-bvdd5\" (UID: \"fae0f5f8-e721-4ef1-9c8f-4574f156913f\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.556614 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb9s6\" (UniqueName: \"kubernetes.io/projected/83466be2-d230-4516-b594-ee56aae3c510-kube-api-access-qb9s6\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.556619 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not 
found Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.556733 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.556752 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert podName:83466be2-d230-4516-b594-ee56aae3c510 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:29.056729114 +0000 UTC m=+1078.314268668 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" (UID: "83466be2-d230-4516-b594-ee56aae3c510") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.556796 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert podName:771a50fd-33f6-47ba-ac4a-46da5446cdd8 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:29.556775826 +0000 UTC m=+1078.814315380 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert") pod "infra-operator-controller-manager-79d975b745-vhmbb" (UID: "771a50fd-33f6-47ba-ac4a-46da5446cdd8") : secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.561758 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.562654 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.586878 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb9s6\" (UniqueName: \"kubernetes.io/projected/83466be2-d230-4516-b594-ee56aae3c510-kube-api-access-qb9s6\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.587156 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.650423 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.653866 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.656010 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-n4k5g" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.658586 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c4x8\" (UniqueName: \"kubernetes.io/projected/fae0f5f8-e721-4ef1-9c8f-4574f156913f-kube-api-access-7c4x8\") pod \"ovn-operator-controller-manager-5955d8c787-bvdd5\" (UID: \"fae0f5f8-e721-4ef1-9c8f-4574f156913f\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.658627 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zrxh\" (UniqueName: \"kubernetes.io/projected/3747ddf8-799c-441c-bd9d-4450bdb72382-kube-api-access-6zrxh\") pod \"swift-operator-controller-manager-68f46476f-5mgl8\" (UID: \"3747ddf8-799c-441c-bd9d-4450bdb72382\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.658918 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p6kf\" (UniqueName: \"kubernetes.io/projected/11361a5e-18c5-448a-8b07-8f5e3245f607-kube-api-access-7p6kf\") pod \"telemetry-operator-controller-manager-9d678b567-gttml\" (UID: \"11361a5e-18c5-448a-8b07-8f5e3245f607\") " pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.659009 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rvr4\" (UniqueName: \"kubernetes.io/projected/9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1-kube-api-access-4rvr4\") pod 
\"placement-operator-controller-manager-8497b45c89-rn2vt\" (UID: \"9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.662037 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.676525 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.693840 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.694645 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zrxh\" (UniqueName: \"kubernetes.io/projected/3747ddf8-799c-441c-bd9d-4450bdb72382-kube-api-access-6zrxh\") pod \"swift-operator-controller-manager-68f46476f-5mgl8\" (UID: \"3747ddf8-799c-441c-bd9d-4450bdb72382\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.695854 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-8c68w" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.698588 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.707423 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rvr4\" (UniqueName: \"kubernetes.io/projected/9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1-kube-api-access-4rvr4\") pod \"placement-operator-controller-manager-8497b45c89-rn2vt\" (UID: 
\"9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.708092 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c4x8\" (UniqueName: \"kubernetes.io/projected/fae0f5f8-e721-4ef1-9c8f-4574f156913f-kube-api-access-7c4x8\") pod \"ovn-operator-controller-manager-5955d8c787-bvdd5\" (UID: \"fae0f5f8-e721-4ef1-9c8f-4574f156913f\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.710080 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.731605 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.739203 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.741008 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.741219 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-j74ff" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.743729 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.761050 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9s8r\" (UniqueName: \"kubernetes.io/projected/cf1fe81a-282d-4e51-b8d9-d6569a640985-kube-api-access-p9s8r\") pod \"test-operator-controller-manager-5dc6794d5b-dc7k2\" (UID: \"cf1fe81a-282d-4e51-b8d9-d6569a640985\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.761100 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfl55\" (UniqueName: \"kubernetes.io/projected/d31610db-32c1-4c99-9001-ab4504649a75-kube-api-access-sfl55\") pod \"watcher-operator-controller-manager-bccc79885-gs62l\" (UID: \"d31610db-32c1-4c99-9001-ab4504649a75\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.761145 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p6kf\" (UniqueName: \"kubernetes.io/projected/11361a5e-18c5-448a-8b07-8f5e3245f607-kube-api-access-7p6kf\") pod \"telemetry-operator-controller-manager-9d678b567-gttml\" (UID: \"11361a5e-18c5-448a-8b07-8f5e3245f607\") " 
pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.774907 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.792930 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.793997 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.798279 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.800464 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p6kf\" (UniqueName: \"kubernetes.io/projected/11361a5e-18c5-448a-8b07-8f5e3245f607-kube-api-access-7p6kf\") pod \"telemetry-operator-controller-manager-9d678b567-gttml\" (UID: \"11361a5e-18c5-448a-8b07-8f5e3245f607\") " pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.802150 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-kg2v9" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.807865 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.862831 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9s8r\" (UniqueName: 
\"kubernetes.io/projected/cf1fe81a-282d-4e51-b8d9-d6569a640985-kube-api-access-p9s8r\") pod \"test-operator-controller-manager-5dc6794d5b-dc7k2\" (UID: \"cf1fe81a-282d-4e51-b8d9-d6569a640985\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.862881 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfl55\" (UniqueName: \"kubernetes.io/projected/d31610db-32c1-4c99-9001-ab4504649a75-kube-api-access-sfl55\") pod \"watcher-operator-controller-manager-bccc79885-gs62l\" (UID: \"d31610db-32c1-4c99-9001-ab4504649a75\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.862913 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grqv8\" (UniqueName: \"kubernetes.io/projected/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-kube-api-access-grqv8\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.862948 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.862989 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhd9z\" (UniqueName: \"kubernetes.io/projected/6d15395c-5ed9-43c8-b7f6-ac16e6e32e70-kube-api-access-fhd9z\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-7gr7g\" (UID: \"6d15395c-5ed9-43c8-b7f6-ac16e6e32e70\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.863031 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.892477 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.903174 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9s8r\" (UniqueName: \"kubernetes.io/projected/cf1fe81a-282d-4e51-b8d9-d6569a640985-kube-api-access-p9s8r\") pod \"test-operator-controller-manager-5dc6794d5b-dc7k2\" (UID: \"cf1fe81a-282d-4e51-b8d9-d6569a640985\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.907860 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.914199 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfl55\" (UniqueName: \"kubernetes.io/projected/d31610db-32c1-4c99-9001-ab4504649a75-kube-api-access-sfl55\") pod \"watcher-operator-controller-manager-bccc79885-gs62l\" (UID: \"d31610db-32c1-4c99-9001-ab4504649a75\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.966527 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grqv8\" (UniqueName: \"kubernetes.io/projected/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-kube-api-access-grqv8\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.966574 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.966639 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhd9z\" (UniqueName: \"kubernetes.io/projected/6d15395c-5ed9-43c8-b7f6-ac16e6e32e70-kube-api-access-fhd9z\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7gr7g\" (UID: \"6d15395c-5ed9-43c8-b7f6-ac16e6e32e70\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.966660 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.966799 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.966845 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:29.466830441 +0000 UTC m=+1078.724369995 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "metrics-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.967084 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.967107 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:29.467100638 +0000 UTC m=+1078.724640192 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "webhook-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.983661 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhd9z\" (UniqueName: \"kubernetes.io/projected/6d15395c-5ed9-43c8-b7f6-ac16e6e32e70-kube-api-access-fhd9z\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7gr7g\" (UID: \"6d15395c-5ed9-43c8-b7f6-ac16e6e32e70\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.985273 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grqv8\" (UniqueName: \"kubernetes.io/projected/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-kube-api-access-grqv8\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.019589 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.043067 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.069664 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.069837 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.069880 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert podName:83466be2-d230-4516-b594-ee56aae3c510 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:30.069867834 +0000 UTC m=+1079.327407388 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" (UID: "83466be2-d230-4516-b594-ee56aae3c510") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.075826 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.081881 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf"] Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.095264 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw"] Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.104319 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm"] Feb 27 00:23:29 crc kubenswrapper[4781]: W0227 00:23:29.107837 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6739bbb3_bf62_4b1d_8dd7_3accde691e66.slice/crio-c8d874c90ac7f39ebabac7500db8f41a011950198638aad7b783a07f6bdb6f92 WatchSource:0}: Error finding container c8d874c90ac7f39ebabac7500db8f41a011950198638aad7b783a07f6bdb6f92: Status 404 returned error can't find the container with id c8d874c90ac7f39ebabac7500db8f41a011950198638aad7b783a07f6bdb6f92 Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.134517 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.288525 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz"] Feb 27 00:23:29 crc kubenswrapper[4781]: W0227 00:23:29.300593 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1807c06_6c68_477c_8725_5702e2d59c93.slice/crio-7e627c843460bd7de182acf05ba1b7b604c4be0586d0a39818f12398564f268f WatchSource:0}: Error finding container 7e627c843460bd7de182acf05ba1b7b604c4be0586d0a39818f12398564f268f: Status 404 returned error can't find the container with id 7e627c843460bd7de182acf05ba1b7b604c4be0586d0a39818f12398564f268f Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.321061 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b"] Feb 27 00:23:29 crc kubenswrapper[4781]: W0227 00:23:29.324089 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd77d7fe_85fb_4b16_aa12_75359b52e139.slice/crio-b334f23af544bcdb23e245902b2a580ff297a1dc8c4d52da3bf273b3914fa6bc WatchSource:0}: Error finding container b334f23af544bcdb23e245902b2a580ff297a1dc8c4d52da3bf273b3914fa6bc: Status 404 returned error can't find the container with id b334f23af544bcdb23e245902b2a580ff297a1dc8c4d52da3bf273b3914fa6bc Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.360132 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" event={"ID":"c1807c06-6c68-477c-8725-5702e2d59c93","Type":"ContainerStarted","Data":"7e627c843460bd7de182acf05ba1b7b604c4be0586d0a39818f12398564f268f"} Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.360929 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" event={"ID":"bd77d7fe-85fb-4b16-aa12-75359b52e139","Type":"ContainerStarted","Data":"b334f23af544bcdb23e245902b2a580ff297a1dc8c4d52da3bf273b3914fa6bc"} Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.361719 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" event={"ID":"e4d59c4e-1fd2-43d9-8ac2-d162e746e758","Type":"ContainerStarted","Data":"97a6ec41c3ce1eedeab33905b51fdd25b1760cafc7074733bf3862ffde23fd61"} Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.362549 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" event={"ID":"6739bbb3-bf62-4b1d-8dd7-3accde691e66","Type":"ContainerStarted","Data":"c8d874c90ac7f39ebabac7500db8f41a011950198638aad7b783a07f6bdb6f92"} Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.363542 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" event={"ID":"fe1f6a92-751f-417e-b2ff-694c10210db7","Type":"ContainerStarted","Data":"1f4bbde81c9128842228ac366fd53d66674cc5b244ec0c9beaf14bc661503355"} Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.425975 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb"] Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.432689 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88"] Feb 27 00:23:29 crc kubenswrapper[4781]: W0227 00:23:29.434527 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe25346c_5f31_478e_a639_060c5958b1eb.slice/crio-58b060d9a69f405cfd8981fcb83cce5701eb693e39ed61f0498941eafbd317ef WatchSource:0}: Error finding container 58b060d9a69f405cfd8981fcb83cce5701eb693e39ed61f0498941eafbd317ef: Status 404 returned error can't find the container with id 58b060d9a69f405cfd8981fcb83cce5701eb693e39ed61f0498941eafbd317ef Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.437991 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t"] Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.442513 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6"] Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.447703 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w"] Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.475547 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.475617 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.475806 4781 secret.go:188] Couldn't get secret 
openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.475855 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:30.475840554 +0000 UTC m=+1079.733380108 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "metrics-server-cert" not found Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.477099 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.477163 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:30.477145958 +0000 UTC m=+1079.734685512 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "webhook-server-cert" not found Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.577216 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.577408 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.577483 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert podName:771a50fd-33f6-47ba-ac4a-46da5446cdd8 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:31.57746345 +0000 UTC m=+1080.835003004 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert") pod "infra-operator-controller-manager-79d975b745-vhmbb" (UID: "771a50fd-33f6-47ba-ac4a-46da5446cdd8") : secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.637147 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8"] Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.649792 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb"] Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.661800 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6l84m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-659dc6bbfc-tb298_openstack-operators(7d5e1e13-5ce4-48ba-a8c9-3db924e63840): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.662968 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" podUID="7d5e1e13-5ce4-48ba-a8c9-3db924e63840" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.665370 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298"] Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.672991 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4rvr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-rn2vt_openstack-operators(9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.683603 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" podUID="9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.697581 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt"] Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.701779 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fp4z9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-w5wp5_openstack-operators(f777df4b-1040-4f86-a816-ea778b9e5dc3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.702953 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" podUID="f777df4b-1040-4f86-a816-ea778b9e5dc3" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.705187 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5"] Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.752814 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5"] Feb 27 00:23:29 crc kubenswrapper[4781]: W0227 00:23:29.756415 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfae0f5f8_e721_4ef1_9c8f_4574f156913f.slice/crio-f773426c28e98459761fb10a984e36ed348487207e7a1cf937002fafbc60f04e WatchSource:0}: Error finding container f773426c28e98459761fb10a984e36ed348487207e7a1cf937002fafbc60f04e: Status 404 returned error can't find the container with id f773426c28e98459761fb10a984e36ed348487207e7a1cf937002fafbc60f04e Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.770676 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l"] Feb 27 00:23:29 crc kubenswrapper[4781]: W0227 00:23:29.774840 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd31610db_32c1_4c99_9001_ab4504649a75.slice/crio-26556b50c7ff457d0b0f5b1bf419e89f769e57ef9cdf44f9ecb57d188af3cf1e WatchSource:0}: Error finding container 26556b50c7ff457d0b0f5b1bf419e89f769e57ef9cdf44f9ecb57d188af3cf1e: Status 404 returned error can't find the container with id 26556b50c7ff457d0b0f5b1bf419e89f769e57ef9cdf44f9ecb57d188af3cf1e Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.777014 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sfl55,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-gs62l_openstack-operators(d31610db-32c1-4c99-9001-ab4504649a75): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 00:23:29 crc kubenswrapper[4781]: W0227 00:23:29.777098 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11361a5e_18c5_448a_8b07_8f5e3245f607.slice/crio-4d0ecadaddf7458069bcc13ce6c840b2309e0e3d7c02c93140ac8ec476108dc5 WatchSource:0}: Error finding container 4d0ecadaddf7458069bcc13ce6c840b2309e0e3d7c02c93140ac8ec476108dc5: Status 404 returned error can't find the container with id 4d0ecadaddf7458069bcc13ce6c840b2309e0e3d7c02c93140ac8ec476108dc5 Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.778700 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" podUID="d31610db-32c1-4c99-9001-ab4504649a75" Feb 27 00:23:29 crc kubenswrapper[4781]: W0227 00:23:29.779044 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf1fe81a_282d_4e51_b8d9_d6569a640985.slice/crio-95499e972581b730f0cea49768aa004bf09268400286562361694f99b1cbf4c8 WatchSource:0}: Error finding container 95499e972581b730f0cea49768aa004bf09268400286562361694f99b1cbf4c8: Status 404 returned error can't find the container with id 95499e972581b730f0cea49768aa004bf09268400286562361694f99b1cbf4c8 Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.779104 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml"] Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.780602 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.17:5001/openstack-k8s-operators/telemetry-operator:39a4be8a175d9e84fa6ba159f906a95524540b13,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7p6kf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-9d678b567-gttml_openstack-operators(11361a5e-18c5-448a-8b07-8f5e3245f607): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.781952 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" podUID="11361a5e-18c5-448a-8b07-8f5e3245f607" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.782395 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p9s8r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5dc6794d5b-dc7k2_openstack-operators(cf1fe81a-282d-4e51-b8d9-d6569a640985): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.784008 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" podUID="cf1fe81a-282d-4e51-b8d9-d6569a640985" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.788402 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fhd9z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-7gr7g_openstack-operators(6d15395c-5ed9-43c8-b7f6-ac16e6e32e70): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.789724 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g"] Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.789978 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" podUID="6d15395c-5ed9-43c8-b7f6-ac16e6e32e70" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.795290 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2"] Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.098023 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.098521 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.098581 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert podName:83466be2-d230-4516-b594-ee56aae3c510 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:32.098564389 +0000 UTC m=+1081.356103943 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" (UID: "83466be2-d230-4516-b594-ee56aae3c510") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.385242 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" event={"ID":"d31610db-32c1-4c99-9001-ab4504649a75","Type":"ContainerStarted","Data":"26556b50c7ff457d0b0f5b1bf419e89f769e57ef9cdf44f9ecb57d188af3cf1e"} Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.393239 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" podUID="d31610db-32c1-4c99-9001-ab4504649a75" Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.395154 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" event={"ID":"f777df4b-1040-4f86-a816-ea778b9e5dc3","Type":"ContainerStarted","Data":"5006b2714949060f6141e8bf358bd04edd9af86f59f2c746c148ad75f86a8685"} Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.396521 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" podUID="f777df4b-1040-4f86-a816-ea778b9e5dc3" Feb 27 00:23:30 crc kubenswrapper[4781]: 
I0227 00:23:30.396694 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" event={"ID":"057d4c8d-606e-44ea-89ea-fb17b4d63733","Type":"ContainerStarted","Data":"54e700b978a1894bf048057301bd7f5c2c25d78229d46ec352288697447febc9"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.399279 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" event={"ID":"fe25346c-5f31-478e-a639-060c5958b1eb","Type":"ContainerStarted","Data":"58b060d9a69f405cfd8981fcb83cce5701eb693e39ed61f0498941eafbd317ef"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.407755 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" event={"ID":"7d5e1e13-5ce4-48ba-a8c9-3db924e63840","Type":"ContainerStarted","Data":"3588b8ee5c71d52bec6cd15f9541e3423f20f5b62a21d1be12c78bc9f096d81c"} Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.409360 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" podUID="7d5e1e13-5ce4-48ba-a8c9-3db924e63840" Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.413776 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" event={"ID":"513da4ed-be63-45dd-a32a-27ac3ef443a5","Type":"ContainerStarted","Data":"54a54e61d3fe738352352c66c40b4b668766e1c1757771f9ad158b183e5ee63e"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.417012 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" event={"ID":"11361a5e-18c5-448a-8b07-8f5e3245f607","Type":"ContainerStarted","Data":"4d0ecadaddf7458069bcc13ce6c840b2309e0e3d7c02c93140ac8ec476108dc5"} Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.418166 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.17:5001/openstack-k8s-operators/telemetry-operator:39a4be8a175d9e84fa6ba159f906a95524540b13\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" podUID="11361a5e-18c5-448a-8b07-8f5e3245f607" Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.419280 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" event={"ID":"e9a3b900-688c-4043-b1ff-53ae1c3ee1d6","Type":"ContainerStarted","Data":"060e8ffe2106a86b9f842715e1a1d726e3f240b1b652c3577e7be28b4e5e1287"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.422310 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" event={"ID":"a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad","Type":"ContainerStarted","Data":"45d3a51fbefc905a529c4fc48673fc6eb0e6d525982fe5d486ec663988b95812"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.429521 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" event={"ID":"9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1","Type":"ContainerStarted","Data":"81a298accb6bbe5272d844b50c60ceb3883903dce2e219da7f617f392d9406d9"} Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.430939 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" podUID="9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1" Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.433088 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" event={"ID":"6d15395c-5ed9-43c8-b7f6-ac16e6e32e70","Type":"ContainerStarted","Data":"ce1a0348bf72abb9487d943efd38681162d4fa08b6be47c16c9c3662cc3b2c28"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.435139 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" event={"ID":"cf1fe81a-282d-4e51-b8d9-d6569a640985","Type":"ContainerStarted","Data":"95499e972581b730f0cea49768aa004bf09268400286562361694f99b1cbf4c8"} Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.435776 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" podUID="6d15395c-5ed9-43c8-b7f6-ac16e6e32e70" Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.438378 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" podUID="cf1fe81a-282d-4e51-b8d9-d6569a640985" Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.440203 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" event={"ID":"66c995b3-f763-455e-8ea3-7dfdfb4c4301","Type":"ContainerStarted","Data":"edd12e3c2539dadd2f78b86bd4c3fcccd1dbd25f0c6559732f8f5ade5b6c1b27"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.441607 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" event={"ID":"fae0f5f8-e721-4ef1-9c8f-4574f156913f","Type":"ContainerStarted","Data":"f773426c28e98459761fb10a984e36ed348487207e7a1cf937002fafbc60f04e"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.442915 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" event={"ID":"3747ddf8-799c-441c-bd9d-4450bdb72382","Type":"ContainerStarted","Data":"67b2717335587eb39bb913c5d5a6459f2ea1b9d9449ad76347a23f67f3753779"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.507348 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.507494 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.508657 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret 
"metrics-server-cert" not found Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.508697 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:32.508682628 +0000 UTC m=+1081.766222182 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "metrics-server-cert" not found Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.509679 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.509711 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:32.509701554 +0000 UTC m=+1081.767241108 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "webhook-server-cert" not found Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.480001 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" podUID="7d5e1e13-5ce4-48ba-a8c9-3db924e63840" Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.480130 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.17:5001/openstack-k8s-operators/telemetry-operator:39a4be8a175d9e84fa6ba159f906a95524540b13\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" podUID="11361a5e-18c5-448a-8b07-8f5e3245f607" Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.480158 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" podUID="d31610db-32c1-4c99-9001-ab4504649a75" Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.480176 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" podUID="9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1" Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.480180 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" podUID="6d15395c-5ed9-43c8-b7f6-ac16e6e32e70" Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.480210 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" podUID="cf1fe81a-282d-4e51-b8d9-d6569a640985" Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.480216 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" podUID="f777df4b-1040-4f86-a816-ea778b9e5dc3" Feb 27 00:23:31 crc kubenswrapper[4781]: I0227 00:23:31.634218 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: 
\"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.635920 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.635963 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert podName:771a50fd-33f6-47ba-ac4a-46da5446cdd8 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:35.635949959 +0000 UTC m=+1084.893489513 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert") pod "infra-operator-controller-manager-79d975b745-vhmbb" (UID: "771a50fd-33f6-47ba-ac4a-46da5446cdd8") : secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:32 crc kubenswrapper[4781]: I0227 00:23:32.150147 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:32 crc kubenswrapper[4781]: E0227 00:23:32.150360 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:32 crc kubenswrapper[4781]: E0227 00:23:32.150456 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert podName:83466be2-d230-4516-b594-ee56aae3c510 nodeName:}" failed. 
No retries permitted until 2026-02-27 00:23:36.150435795 +0000 UTC m=+1085.407975349 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" (UID: "83466be2-d230-4516-b594-ee56aae3c510") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:32 crc kubenswrapper[4781]: I0227 00:23:32.555254 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:32 crc kubenswrapper[4781]: E0227 00:23:32.555445 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 00:23:32 crc kubenswrapper[4781]: I0227 00:23:32.555701 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:32 crc kubenswrapper[4781]: E0227 00:23:32.555739 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:36.555709716 +0000 UTC m=+1085.813249270 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "metrics-server-cert" not found Feb 27 00:23:32 crc kubenswrapper[4781]: E0227 00:23:32.555835 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 00:23:32 crc kubenswrapper[4781]: E0227 00:23:32.555891 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:36.55587564 +0000 UTC m=+1085.813415194 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "webhook-server-cert" not found Feb 27 00:23:35 crc kubenswrapper[4781]: I0227 00:23:35.708450 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:35 crc kubenswrapper[4781]: E0227 00:23:35.708694 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:35 crc kubenswrapper[4781]: E0227 00:23:35.708975 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert 
podName:771a50fd-33f6-47ba-ac4a-46da5446cdd8 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:43.708954236 +0000 UTC m=+1092.966493790 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert") pod "infra-operator-controller-manager-79d975b745-vhmbb" (UID: "771a50fd-33f6-47ba-ac4a-46da5446cdd8") : secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:36 crc kubenswrapper[4781]: I0227 00:23:36.216934 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:36 crc kubenswrapper[4781]: E0227 00:23:36.217113 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:36 crc kubenswrapper[4781]: E0227 00:23:36.217223 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert podName:83466be2-d230-4516-b594-ee56aae3c510 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:44.217199498 +0000 UTC m=+1093.474739062 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" (UID: "83466be2-d230-4516-b594-ee56aae3c510") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:36 crc kubenswrapper[4781]: I0227 00:23:36.626544 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:36 crc kubenswrapper[4781]: I0227 00:23:36.626738 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:36 crc kubenswrapper[4781]: E0227 00:23:36.626824 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 00:23:36 crc kubenswrapper[4781]: E0227 00:23:36.626955 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 00:23:36 crc kubenswrapper[4781]: E0227 00:23:36.626966 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:44.626927577 +0000 UTC m=+1093.884467171 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "webhook-server-cert" not found Feb 27 00:23:36 crc kubenswrapper[4781]: E0227 00:23:36.627019 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:44.626996308 +0000 UTC m=+1093.884535952 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "metrics-server-cert" not found Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.572332 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" event={"ID":"fae0f5f8-e721-4ef1-9c8f-4574f156913f","Type":"ContainerStarted","Data":"ebab2794d9e030be0865957878bdaa84f8b2f279a1def3bd5ca3f62fdc716e9a"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.572998 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.583014 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" event={"ID":"c1807c06-6c68-477c-8725-5702e2d59c93","Type":"ContainerStarted","Data":"e1e76b1c9f0d463e5f7ab46694ae8e019f9fea48aa6d07e7b6d4666c0655a794"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.583113 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.587981 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" event={"ID":"3747ddf8-799c-441c-bd9d-4450bdb72382","Type":"ContainerStarted","Data":"299a1389dae53d1431a0c0bd7ebf880145d792dbe52ed0c2b39dfb14c873121d"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.588178 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.598237 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" event={"ID":"a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad","Type":"ContainerStarted","Data":"3788d7780bf568a2713303959311410200b328c2042d2cd38c7d8f3aba1e0a19"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.598540 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.607208 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" event={"ID":"e4d59c4e-1fd2-43d9-8ac2-d162e746e758","Type":"ContainerStarted","Data":"6fd7dae2aa3f05d8711ac2cda15fffd04c282e92e56ccc013ae6834e00bd1081"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.607346 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.613976 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" 
event={"ID":"66c995b3-f763-455e-8ea3-7dfdfb4c4301","Type":"ContainerStarted","Data":"8ac8efa04cc24746772ee35fea845a9d140de729d58d693bb10c72e413c876e1"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.614067 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.617764 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" event={"ID":"6739bbb3-bf62-4b1d-8dd7-3accde691e66","Type":"ContainerStarted","Data":"18dd82aa88f3d058ab7fc03fcc9687487f0bd26f7241f7339aac4aa4c409161b"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.618130 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.627936 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" event={"ID":"fe1f6a92-751f-417e-b2ff-694c10210db7","Type":"ContainerStarted","Data":"0b08e8fd35e96179fd376eaef5cebaa658116cfeb02226b8d7ecd8198e0b5eb3"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.629005 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.645016 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" event={"ID":"bd77d7fe-85fb-4b16-aa12-75359b52e139","Type":"ContainerStarted","Data":"f5c3c43efc443ab99410325394c2ef43279dae4e8ca4f6328f5acecf3c7873e4"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.645485 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.662182 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" event={"ID":"057d4c8d-606e-44ea-89ea-fb17b4d63733","Type":"ContainerStarted","Data":"dd20b9bef99a25d947b9ff22e14ccb489351f961d60c15a816f7d746fdeca5b5"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.662759 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.677581 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" event={"ID":"fe25346c-5f31-478e-a639-060c5958b1eb","Type":"ContainerStarted","Data":"7f7af3e6ab0d27e04e135918941e8b4fdd6816db2486a20d90b12222cb813ba8"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.677992 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.687356 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" podStartSLOduration=2.7920212429999998 podStartE2EDuration="14.687337475s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.761292575 +0000 UTC m=+1079.018832129" lastFinishedPulling="2026-02-27 00:23:41.656608807 +0000 UTC m=+1090.914148361" observedRunningTime="2026-02-27 00:23:42.63933317 +0000 UTC m=+1091.896872724" watchObservedRunningTime="2026-02-27 00:23:42.687337475 +0000 UTC m=+1091.944877029" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.688372 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" podStartSLOduration=3.5054701440000002 podStartE2EDuration="15.688365632s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.442809991 +0000 UTC m=+1078.700349545" lastFinishedPulling="2026-02-27 00:23:41.625705479 +0000 UTC m=+1090.883245033" observedRunningTime="2026-02-27 00:23:42.687809807 +0000 UTC m=+1091.945349361" watchObservedRunningTime="2026-02-27 00:23:42.688365632 +0000 UTC m=+1091.945905186" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.692275 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" event={"ID":"513da4ed-be63-45dd-a32a-27ac3ef443a5","Type":"ContainerStarted","Data":"5dd297adcf1fc3e0814087838651a7c1568abfd0396891b04947722d4e83c15e"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.693005 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.707289 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" event={"ID":"e9a3b900-688c-4043-b1ff-53ae1c3ee1d6","Type":"ContainerStarted","Data":"f0c143f8e1db419222ec838d37b6754fd18b71bbe9ee5893a54f0d84c18d707e"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.708353 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.758812 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" podStartSLOduration=3.458429424 podStartE2EDuration="15.758793152s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" 
firstStartedPulling="2026-02-27 00:23:29.304579348 +0000 UTC m=+1078.562118902" lastFinishedPulling="2026-02-27 00:23:41.604943066 +0000 UTC m=+1090.862482630" observedRunningTime="2026-02-27 00:23:42.723554551 +0000 UTC m=+1091.981094095" watchObservedRunningTime="2026-02-27 00:23:42.758793152 +0000 UTC m=+1092.016332696" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.761284 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" podStartSLOduration=3.762131421 podStartE2EDuration="15.761264307s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.657397819 +0000 UTC m=+1078.914937373" lastFinishedPulling="2026-02-27 00:23:41.656530705 +0000 UTC m=+1090.914070259" observedRunningTime="2026-02-27 00:23:42.758238318 +0000 UTC m=+1092.015777872" watchObservedRunningTime="2026-02-27 00:23:42.761264307 +0000 UTC m=+1092.018803861" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.789037 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" podStartSLOduration=3.271942861 podStartE2EDuration="15.789021852s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.139443103 +0000 UTC m=+1078.396982647" lastFinishedPulling="2026-02-27 00:23:41.656522084 +0000 UTC m=+1090.914061638" observedRunningTime="2026-02-27 00:23:42.783919889 +0000 UTC m=+1092.041459463" watchObservedRunningTime="2026-02-27 00:23:42.789021852 +0000 UTC m=+1092.046561416" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.807258 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" podStartSLOduration=3.290411712 podStartE2EDuration="15.807240888s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" 
firstStartedPulling="2026-02-27 00:23:29.140317785 +0000 UTC m=+1078.397857339" lastFinishedPulling="2026-02-27 00:23:41.657146961 +0000 UTC m=+1090.914686515" observedRunningTime="2026-02-27 00:23:42.804359503 +0000 UTC m=+1092.061899057" watchObservedRunningTime="2026-02-27 00:23:42.807240888 +0000 UTC m=+1092.064780442" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.835530 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" podStartSLOduration=3.376320128 podStartE2EDuration="15.835516057s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.138911479 +0000 UTC m=+1078.396451033" lastFinishedPulling="2026-02-27 00:23:41.598107408 +0000 UTC m=+1090.855646962" observedRunningTime="2026-02-27 00:23:42.832447147 +0000 UTC m=+1092.089986701" watchObservedRunningTime="2026-02-27 00:23:42.835516057 +0000 UTC m=+1092.093055611" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.852643 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" podStartSLOduration=2.834999656 podStartE2EDuration="14.852602854s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.638128556 +0000 UTC m=+1078.895668110" lastFinishedPulling="2026-02-27 00:23:41.655731744 +0000 UTC m=+1090.913271308" observedRunningTime="2026-02-27 00:23:42.849482763 +0000 UTC m=+1092.107022317" watchObservedRunningTime="2026-02-27 00:23:42.852602854 +0000 UTC m=+1092.110142418" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.865978 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" podStartSLOduration=3.637192606 podStartE2EDuration="15.865961363s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" 
firstStartedPulling="2026-02-27 00:23:29.444329831 +0000 UTC m=+1078.701869385" lastFinishedPulling="2026-02-27 00:23:41.673098588 +0000 UTC m=+1090.930638142" observedRunningTime="2026-02-27 00:23:42.864779142 +0000 UTC m=+1092.122318696" watchObservedRunningTime="2026-02-27 00:23:42.865961363 +0000 UTC m=+1092.123500917" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.886376 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" podStartSLOduration=3.720034542 podStartE2EDuration="15.886355676s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.442676758 +0000 UTC m=+1078.700216312" lastFinishedPulling="2026-02-27 00:23:41.608997892 +0000 UTC m=+1090.866537446" observedRunningTime="2026-02-27 00:23:42.884440006 +0000 UTC m=+1092.141979570" watchObservedRunningTime="2026-02-27 00:23:42.886355676 +0000 UTC m=+1092.143895240" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.898939 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.899008 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.919840 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" podStartSLOduration=3.640892103 
podStartE2EDuration="15.919822591s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.325818683 +0000 UTC m=+1078.583358237" lastFinishedPulling="2026-02-27 00:23:41.604749171 +0000 UTC m=+1090.862288725" observedRunningTime="2026-02-27 00:23:42.914013039 +0000 UTC m=+1092.171552583" watchObservedRunningTime="2026-02-27 00:23:42.919822591 +0000 UTC m=+1092.177362145" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.939930 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" podStartSLOduration=3.7783667359999997 podStartE2EDuration="15.939914826s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.443246293 +0000 UTC m=+1078.700785837" lastFinishedPulling="2026-02-27 00:23:41.604794373 +0000 UTC m=+1090.862333927" observedRunningTime="2026-02-27 00:23:42.937593885 +0000 UTC m=+1092.195133439" watchObservedRunningTime="2026-02-27 00:23:42.939914826 +0000 UTC m=+1092.197454380" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.958474 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" podStartSLOduration=3.678044174 podStartE2EDuration="15.958455921s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.442708508 +0000 UTC m=+1078.700248062" lastFinishedPulling="2026-02-27 00:23:41.723120255 +0000 UTC m=+1090.980659809" observedRunningTime="2026-02-27 00:23:42.955530834 +0000 UTC m=+1092.213070378" watchObservedRunningTime="2026-02-27 00:23:42.958455921 +0000 UTC m=+1092.215995465" Feb 27 00:23:43 crc kubenswrapper[4781]: I0227 00:23:43.754198 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert\") pod 
\"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:43 crc kubenswrapper[4781]: I0227 00:23:43.760268 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:43 crc kubenswrapper[4781]: I0227 00:23:43.771412 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-xzphw" Feb 27 00:23:43 crc kubenswrapper[4781]: I0227 00:23:43.780384 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:44 crc kubenswrapper[4781]: I0227 00:23:44.272280 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:44 crc kubenswrapper[4781]: I0227 00:23:44.282428 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:44 crc kubenswrapper[4781]: I0227 00:23:44.290860 4781 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb"] Feb 27 00:23:44 crc kubenswrapper[4781]: I0227 00:23:44.293665 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-blcnh" Feb 27 00:23:44 crc kubenswrapper[4781]: I0227 00:23:44.302348 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:44 crc kubenswrapper[4781]: I0227 00:23:44.679491 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:44 crc kubenswrapper[4781]: I0227 00:23:44.679815 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:44 crc kubenswrapper[4781]: E0227 00:23:44.680043 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 00:23:44 crc kubenswrapper[4781]: E0227 00:23:44.680122 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:24:00.680103075 +0000 UTC m=+1109.937642629 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "webhook-server-cert" not found Feb 27 00:23:44 crc kubenswrapper[4781]: I0227 00:23:44.696813 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:45 crc kubenswrapper[4781]: W0227 00:23:45.662846 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod771a50fd_33f6_47ba_ac4a_46da5446cdd8.slice/crio-579e138aef64f73a46ab6e02bc9c9a2614b72c787b7df35236b5d305c9049a8c WatchSource:0}: Error finding container 579e138aef64f73a46ab6e02bc9c9a2614b72c787b7df35236b5d305c9049a8c: Status 404 returned error can't find the container with id 579e138aef64f73a46ab6e02bc9c9a2614b72c787b7df35236b5d305c9049a8c Feb 27 00:23:45 crc kubenswrapper[4781]: I0227 00:23:45.734998 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" event={"ID":"771a50fd-33f6-47ba-ac4a-46da5446cdd8","Type":"ContainerStarted","Data":"579e138aef64f73a46ab6e02bc9c9a2614b72c787b7df35236b5d305c9049a8c"} Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.050029 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.066195 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.078211 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.126318 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.158820 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.217259 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.379823 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.414253 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.506353 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.545914 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.566238 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.801900 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.896185 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.149549 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535864-cfd4d"] Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.151338 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535864-cfd4d" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.153792 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.153805 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.154493 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.157605 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535864-cfd4d"] Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.243683 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfqjm\" (UniqueName: \"kubernetes.io/projected/b9402a6e-66bb-4e1e-a33f-7fce411c83b8-kube-api-access-gfqjm\") pod \"auto-csr-approver-29535864-cfd4d\" (UID: \"b9402a6e-66bb-4e1e-a33f-7fce411c83b8\") " 
pod="openshift-infra/auto-csr-approver-29535864-cfd4d" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.259768 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4"] Feb 27 00:24:00 crc kubenswrapper[4781]: E0227 00:24:00.345653 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a" Feb 27 00:24:00 crc kubenswrapper[4781]: E0227 00:24:00.345854 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fp4z9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-w5wp5_openstack-operators(f777df4b-1040-4f86-a816-ea778b9e5dc3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.345954 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfqjm\" (UniqueName: \"kubernetes.io/projected/b9402a6e-66bb-4e1e-a33f-7fce411c83b8-kube-api-access-gfqjm\") pod \"auto-csr-approver-29535864-cfd4d\" (UID: \"b9402a6e-66bb-4e1e-a33f-7fce411c83b8\") " pod="openshift-infra/auto-csr-approver-29535864-cfd4d" Feb 27 00:24:00 crc kubenswrapper[4781]: E0227 00:24:00.346946 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: 
context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" podUID="f777df4b-1040-4f86-a816-ea778b9e5dc3" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.369187 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfqjm\" (UniqueName: \"kubernetes.io/projected/b9402a6e-66bb-4e1e-a33f-7fce411c83b8-kube-api-access-gfqjm\") pod \"auto-csr-approver-29535864-cfd4d\" (UID: \"b9402a6e-66bb-4e1e-a33f-7fce411c83b8\") " pod="openshift-infra/auto-csr-approver-29535864-cfd4d" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.479249 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535864-cfd4d" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.754475 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.761549 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.894111 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-j74ff" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.902315 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:24:01 crc kubenswrapper[4781]: E0227 00:24:01.616424 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd" Feb 27 00:24:01 crc kubenswrapper[4781]: E0227 00:24:01.616754 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4rvr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-rn2vt_openstack-operators(9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:24:01 crc kubenswrapper[4781]: E0227 00:24:01.618076 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" podUID="9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.011242 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.011441 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fhd9z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-7gr7g_openstack-operators(6d15395c-5ed9-43c8-b7f6-ac16e6e32e70): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.012618 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" podUID="6d15395c-5ed9-43c8-b7f6-ac16e6e32e70" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.468285 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.468741 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sfl55,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-gs62l_openstack-operators(d31610db-32c1-4c99-9001-ab4504649a75): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.470969 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" podUID="d31610db-32c1-4c99-9001-ab4504649a75" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.853940 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:aef5ea3dc1d4f5b63416ee1cc12d0360a64229bb3fb954be3dd85eec8f4ae62a" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.854127 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:aef5ea3dc1d4f5b63416ee1cc12d0360a64229bb3fb954be3dd85eec8f4ae62a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vpr5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-79d975b745-vhmbb_openstack-operators(771a50fd-33f6-47ba-ac4a-46da5446cdd8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.855269 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" podUID="771a50fd-33f6-47ba-ac4a-46da5446cdd8" Feb 27 00:24:02 crc kubenswrapper[4781]: I0227 00:24:02.880374 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" 
event={"ID":"83466be2-d230-4516-b594-ee56aae3c510","Type":"ContainerStarted","Data":"c7c04ffaf4b66c6d34f22072c332cd7fa571642346c94a35e0c138e7e21df50e"} Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.881906 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:aef5ea3dc1d4f5b63416ee1cc12d0360a64229bb3fb954be3dd85eec8f4ae62a\\\"\"" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" podUID="771a50fd-33f6-47ba-ac4a-46da5446cdd8" Feb 27 00:24:04 crc kubenswrapper[4781]: E0227 00:24:04.989235 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98" Feb 27 00:24:04 crc kubenswrapper[4781]: E0227 00:24:04.989455 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p9s8r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5dc6794d5b-dc7k2_openstack-operators(cf1fe81a-282d-4e51-b8d9-d6569a640985): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 27 00:24:04 crc kubenswrapper[4781]: E0227 00:24:04.990611 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" podUID="cf1fe81a-282d-4e51-b8d9-d6569a640985"
Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.524253 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535864-cfd4d"]
Feb 27 00:24:05 crc kubenswrapper[4781]: W0227 00:24:05.533875 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9402a6e_66bb_4e1e_a33f_7fce411c83b8.slice/crio-0f2effad7d2f9b3a8b364986d3b2496800347b3242d554451b7d4c789942439f WatchSource:0}: Error finding container 0f2effad7d2f9b3a8b364986d3b2496800347b3242d554451b7d4c789942439f: Status 404 returned error can't find the container with id 0f2effad7d2f9b3a8b364986d3b2496800347b3242d554451b7d4c789942439f
Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.547404 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf"]
Feb 27 00:24:05 crc kubenswrapper[4781]: W0227 00:24:05.555477 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fe881c2_cb59_41ce_a23c_f2dcba86d9c3.slice/crio-efb37c89ced21736b4124b5007f93df1cbca5667cfa95469450f490b000d3e82 WatchSource:0}: Error finding container efb37c89ced21736b4124b5007f93df1cbca5667cfa95469450f490b000d3e82: Status 404 returned error can't find the container with id efb37c89ced21736b4124b5007f93df1cbca5667cfa95469450f490b000d3e82
Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.904616 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" event={"ID":"11361a5e-18c5-448a-8b07-8f5e3245f607","Type":"ContainerStarted","Data":"01161ee19b4dff0a1f69598e67b45b7ac1a1e034e9d63077384c03bbecd1a305"}
Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.905129 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml"
Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.905402 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535864-cfd4d" event={"ID":"b9402a6e-66bb-4e1e-a33f-7fce411c83b8","Type":"ContainerStarted","Data":"0f2effad7d2f9b3a8b364986d3b2496800347b3242d554451b7d4c789942439f"}
Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.906478 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" event={"ID":"7d5e1e13-5ce4-48ba-a8c9-3db924e63840","Type":"ContainerStarted","Data":"c9bc986b5b89efcaaf41630bd3949d7ebda64929ff329299873b3f49e8e68663"}
Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.906668 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298"
Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.907937 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" event={"ID":"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3","Type":"ContainerStarted","Data":"bbf00a8c2adc7ae793c27c40c2d41247cf3282ba1fb4da8744e6379a6fd1a1b2"}
Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.907973 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" event={"ID":"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3","Type":"ContainerStarted","Data":"efb37c89ced21736b4124b5007f93df1cbca5667cfa95469450f490b000d3e82"}
Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.908488 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf"
Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.921295 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" podStartSLOduration=4.857749066 podStartE2EDuration="37.921280948s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.780468296 +0000 UTC m=+1079.038007850" lastFinishedPulling="2026-02-27 00:24:02.844000178 +0000 UTC m=+1112.101539732" observedRunningTime="2026-02-27 00:24:05.91981077 +0000 UTC m=+1115.177350354" watchObservedRunningTime="2026-02-27 00:24:05.921280948 +0000 UTC m=+1115.178820502"
Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.953922 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" podStartSLOduration=37.953906434 podStartE2EDuration="37.953906434s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:24:05.953700889 +0000 UTC m=+1115.211240463" watchObservedRunningTime="2026-02-27 00:24:05.953906434 +0000 UTC m=+1115.211445998"
Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.971162 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" podStartSLOduration=5.170375863 podStartE2EDuration="37.971145127s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.661658031 +0000 UTC m=+1078.919197585" lastFinishedPulling="2026-02-27 00:24:02.462427295 +0000 UTC m=+1111.719966849" observedRunningTime="2026-02-27 00:24:05.969177105 +0000 UTC m=+1115.226716669" watchObservedRunningTime="2026-02-27 00:24:05.971145127 +0000 UTC m=+1115.228684681"
Feb 27 00:24:06 crc kubenswrapper[4781]: I0227 00:24:06.915578 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" event={"ID":"83466be2-d230-4516-b594-ee56aae3c510","Type":"ContainerStarted","Data":"c09933a142395c89175c8f4b09ce1001f94adf8018377eefdcc29796a34dffef"}
Feb 27 00:24:06 crc kubenswrapper[4781]: I0227 00:24:06.916327 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4"
Feb 27 00:24:06 crc kubenswrapper[4781]: I0227 00:24:06.954019 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" podStartSLOduration=34.874077018 podStartE2EDuration="38.953996998s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:24:02.466944344 +0000 UTC m=+1111.724483898" lastFinishedPulling="2026-02-27 00:24:06.546864324 +0000 UTC m=+1115.804403878" observedRunningTime="2026-02-27 00:24:06.945035393 +0000 UTC m=+1116.202574947" watchObservedRunningTime="2026-02-27 00:24:06.953996998 +0000 UTC m=+1116.211536552"
Feb 27 00:24:07 crc kubenswrapper[4781]: I0227 00:24:07.923086 4781 generic.go:334] "Generic (PLEG): container finished" podID="b9402a6e-66bb-4e1e-a33f-7fce411c83b8" containerID="e7c34540c9407121a9ee96d4e0537e4a13bd65448411272b9cedd072273699e8" exitCode=0
Feb 27 00:24:07 crc kubenswrapper[4781]: I0227 00:24:07.923140 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535864-cfd4d" event={"ID":"b9402a6e-66bb-4e1e-a33f-7fce411c83b8","Type":"ContainerDied","Data":"e7c34540c9407121a9ee96d4e0537e4a13bd65448411272b9cedd072273699e8"}
Feb 27 00:24:09 crc kubenswrapper[4781]: I0227 00:24:09.265936 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535864-cfd4d"
Feb 27 00:24:09 crc kubenswrapper[4781]: I0227 00:24:09.372619 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfqjm\" (UniqueName: \"kubernetes.io/projected/b9402a6e-66bb-4e1e-a33f-7fce411c83b8-kube-api-access-gfqjm\") pod \"b9402a6e-66bb-4e1e-a33f-7fce411c83b8\" (UID: \"b9402a6e-66bb-4e1e-a33f-7fce411c83b8\") "
Feb 27 00:24:09 crc kubenswrapper[4781]: I0227 00:24:09.377929 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9402a6e-66bb-4e1e-a33f-7fce411c83b8-kube-api-access-gfqjm" (OuterVolumeSpecName: "kube-api-access-gfqjm") pod "b9402a6e-66bb-4e1e-a33f-7fce411c83b8" (UID: "b9402a6e-66bb-4e1e-a33f-7fce411c83b8"). InnerVolumeSpecName "kube-api-access-gfqjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:24:09 crc kubenswrapper[4781]: I0227 00:24:09.476364 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfqjm\" (UniqueName: \"kubernetes.io/projected/b9402a6e-66bb-4e1e-a33f-7fce411c83b8-kube-api-access-gfqjm\") on node \"crc\" DevicePath \"\""
Feb 27 00:24:09 crc kubenswrapper[4781]: I0227 00:24:09.942306 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535864-cfd4d" event={"ID":"b9402a6e-66bb-4e1e-a33f-7fce411c83b8","Type":"ContainerDied","Data":"0f2effad7d2f9b3a8b364986d3b2496800347b3242d554451b7d4c789942439f"}
Feb 27 00:24:09 crc kubenswrapper[4781]: I0227 00:24:09.942351 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f2effad7d2f9b3a8b364986d3b2496800347b3242d554451b7d4c789942439f"
Feb 27 00:24:09 crc kubenswrapper[4781]: I0227 00:24:09.942759 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535864-cfd4d"
Feb 27 00:24:10 crc kubenswrapper[4781]: I0227 00:24:10.330278 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535858-9fs8d"]
Feb 27 00:24:10 crc kubenswrapper[4781]: I0227 00:24:10.335614 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535858-9fs8d"]
Feb 27 00:24:10 crc kubenswrapper[4781]: I0227 00:24:10.910983 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf"
Feb 27 00:24:11 crc kubenswrapper[4781]: I0227 00:24:11.317257 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bb1e1bd-28ea-42f4-96d5-534db2674e68" path="/var/lib/kubelet/pods/3bb1e1bd-28ea-42f4-96d5-534db2674e68/volumes"
Feb 27 00:24:12 crc kubenswrapper[4781]: E0227 00:24:12.310848 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" podUID="9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1"
Feb 27 00:24:12 crc kubenswrapper[4781]: E0227 00:24:12.310868 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" podUID="f777df4b-1040-4f86-a816-ea778b9e5dc3"
Feb 27 00:24:12 crc kubenswrapper[4781]: I0227 00:24:12.895462 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 00:24:12 crc kubenswrapper[4781]: I0227 00:24:12.895536 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 00:24:12 crc kubenswrapper[4781]: I0227 00:24:12.895585 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj"
Feb 27 00:24:12 crc kubenswrapper[4781]: I0227 00:24:12.896258 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"58cd249b96a5284dbe453e012e30bb3f9acbc9ed9b891c6e44075d418edc5ad9"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 27 00:24:12 crc kubenswrapper[4781]: I0227 00:24:12.896318 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://58cd249b96a5284dbe453e012e30bb3f9acbc9ed9b891c6e44075d418edc5ad9" gracePeriod=600
Feb 27 00:24:13 crc kubenswrapper[4781]: I0227 00:24:13.981024 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="58cd249b96a5284dbe453e012e30bb3f9acbc9ed9b891c6e44075d418edc5ad9" exitCode=0
Feb 27 00:24:13 crc kubenswrapper[4781]: I0227 00:24:13.981111 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"58cd249b96a5284dbe453e012e30bb3f9acbc9ed9b891c6e44075d418edc5ad9"}
Feb 27 00:24:13 crc kubenswrapper[4781]: I0227 00:24:13.981641 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"40924ce0e5e04646329cd01d3e3dc65fdaf6b21bdd01704d3fa5ed81c86443f6"}
Feb 27 00:24:13 crc kubenswrapper[4781]: I0227 00:24:13.981669 4781 scope.go:117] "RemoveContainer" containerID="4a4838ae34a31bed19fe04c8cb77eb7ca161a34e4d168445bf5a5f93e91a959a"
Feb 27 00:24:14 crc kubenswrapper[4781]: E0227 00:24:14.311160 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" podUID="d31610db-32c1-4c99-9001-ab4504649a75"
Feb 27 00:24:14 crc kubenswrapper[4781]: I0227 00:24:14.315974 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4"
Feb 27 00:24:16 crc kubenswrapper[4781]: I0227 00:24:16.019172 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" event={"ID":"771a50fd-33f6-47ba-ac4a-46da5446cdd8","Type":"ContainerStarted","Data":"3055237bd72add8225f237f142360ed0c7c8f63834d929d7742ed234922fc4a2"}
Feb 27 00:24:16 crc kubenswrapper[4781]: I0227 00:24:16.020194 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb"
Feb 27 00:24:16 crc kubenswrapper[4781]: I0227 00:24:16.045018 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" podStartSLOduration=19.004881071 podStartE2EDuration="49.044984101s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:45.690328487 +0000 UTC m=+1094.947868041" lastFinishedPulling="2026-02-27 00:24:15.730431507 +0000 UTC m=+1124.987971071" observedRunningTime="2026-02-27 00:24:16.041256763 +0000 UTC m=+1125.298796327" watchObservedRunningTime="2026-02-27 00:24:16.044984101 +0000 UTC m=+1125.302523695"
Feb 27 00:24:16 crc kubenswrapper[4781]: E0227 00:24:16.309665 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" podUID="6d15395c-5ed9-43c8-b7f6-ac16e6e32e70"
Feb 27 00:24:17 crc kubenswrapper[4781]: E0227 00:24:17.310926 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" podUID="cf1fe81a-282d-4e51-b8d9-d6569a640985"
Feb 27 00:24:18 crc kubenswrapper[4781]: I0227 00:24:18.666533 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298"
Feb 27 00:24:19 crc kubenswrapper[4781]: I0227 00:24:19.024422 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml"
Feb 27 00:24:23 crc kubenswrapper[4781]: I0227 00:24:23.789720 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb"
Feb 27 00:24:27 crc kubenswrapper[4781]: I0227 00:24:27.128418 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" event={"ID":"9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1","Type":"ContainerStarted","Data":"0fed33bc277b70314e993e7d612c8ba4d799cf63ea7db6799188b04fcbb1701e"}
Feb 27 00:24:27 crc kubenswrapper[4781]: I0227 00:24:27.129220 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt"
Feb 27 00:24:27 crc kubenswrapper[4781]: I0227 00:24:27.129925 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" event={"ID":"f777df4b-1040-4f86-a816-ea778b9e5dc3","Type":"ContainerStarted","Data":"aa9b6f371f5ef215ee04c8b30092785402f0b962e9163bc2f7a62ef89909297b"}
Feb 27 00:24:27 crc kubenswrapper[4781]: I0227 00:24:27.130210 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5"
Feb 27 00:24:27 crc kubenswrapper[4781]: I0227 00:24:27.143072 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" podStartSLOduration=2.08160666 podStartE2EDuration="59.143054912s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.672453463 +0000 UTC m=+1078.929993017" lastFinishedPulling="2026-02-27 00:24:26.733901695 +0000 UTC m=+1135.991441269" observedRunningTime="2026-02-27 00:24:27.14107643 +0000 UTC m=+1136.398615994" watchObservedRunningTime="2026-02-27 00:24:27.143054912 +0000 UTC m=+1136.400594466"
Feb 27 00:24:27 crc kubenswrapper[4781]: I0227 00:24:27.156043 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" podStartSLOduration=3.04253977 podStartE2EDuration="1m0.156026122s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.701551613 +0000 UTC m=+1078.959091177" lastFinishedPulling="2026-02-27 00:24:26.815037955 +0000 UTC m=+1136.072577529" observedRunningTime="2026-02-27 00:24:27.153336922 +0000 UTC m=+1136.410876496" watchObservedRunningTime="2026-02-27 00:24:27.156026122 +0000 UTC m=+1136.413565676"
Feb 27 00:24:30 crc kubenswrapper[4781]: I0227 00:24:30.154896 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" event={"ID":"d31610db-32c1-4c99-9001-ab4504649a75","Type":"ContainerStarted","Data":"82eaab020c799642a67b8d3e650d83ca83cf1f9b9f741b6c7d332936321fa8dd"}
Feb 27 00:24:30 crc kubenswrapper[4781]: I0227 00:24:30.155659 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l"
Feb 27 00:24:30 crc kubenswrapper[4781]: I0227 00:24:30.173689 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" podStartSLOduration=2.108756161 podStartE2EDuration="1m2.173668918s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.776885842 +0000 UTC m=+1079.034425396" lastFinishedPulling="2026-02-27 00:24:29.841798559 +0000 UTC m=+1139.099338153" observedRunningTime="2026-02-27 00:24:30.1729923 +0000 UTC m=+1139.430531874" watchObservedRunningTime="2026-02-27 00:24:30.173668918 +0000 UTC m=+1139.431208482"
Feb 27 00:24:31 crc kubenswrapper[4781]: I0227 00:24:31.162099 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" event={"ID":"6d15395c-5ed9-43c8-b7f6-ac16e6e32e70","Type":"ContainerStarted","Data":"e0c1fd5fb38c3aef7c1dd3b851822a2247022a8fd97f150917ae99dcd6622583"}
Feb 27 00:24:31 crc kubenswrapper[4781]: I0227 00:24:31.179727 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" podStartSLOduration=2.266874279 podStartE2EDuration="1m3.179712437s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.788088015 +0000 UTC m=+1079.045627559" lastFinishedPulling="2026-02-27 00:24:30.700926153 +0000 UTC m=+1139.958465717" observedRunningTime="2026-02-27 00:24:31.17714217 +0000 UTC m=+1140.434681724" watchObservedRunningTime="2026-02-27 00:24:31.179712437 +0000 UTC m=+1140.437251991"
Feb 27 00:24:33 crc kubenswrapper[4781]: I0227 00:24:33.180017 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" event={"ID":"cf1fe81a-282d-4e51-b8d9-d6569a640985","Type":"ContainerStarted","Data":"18882c8130987e0cf6ead5d8d1c23156356def627df7a15c4f5e2383dc0f395c"}
Feb 27 00:24:33 crc kubenswrapper[4781]: I0227 00:24:33.180716 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2"
Feb 27 00:24:33 crc kubenswrapper[4781]: I0227 00:24:33.197833 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" podStartSLOduration=2.132804801 podStartE2EDuration="1m5.197813755s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.782233832 +0000 UTC m=+1079.039773386" lastFinishedPulling="2026-02-27 00:24:32.847242786 +0000 UTC m=+1142.104782340" observedRunningTime="2026-02-27 00:24:33.194236511 +0000 UTC m=+1142.451776095" watchObservedRunningTime="2026-02-27 00:24:33.197813755 +0000 UTC m=+1142.455353319"
Feb 27 00:24:38 crc kubenswrapper[4781]: I0227 00:24:38.526594 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5"
Feb 27 00:24:38 crc kubenswrapper[4781]: I0227 00:24:38.912558 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt"
Feb 27 00:24:39 crc kubenswrapper[4781]: I0227 00:24:39.045781 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2"
Feb 27 00:24:39 crc kubenswrapper[4781]: I0227 00:24:39.079195 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l"
Feb 27 00:24:54 crc kubenswrapper[4781]: I0227 00:24:54.990142 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vdbg9"]
Feb 27 00:24:54 crc kubenswrapper[4781]: E0227 00:24:54.990925 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9402a6e-66bb-4e1e-a33f-7fce411c83b8" containerName="oc"
Feb 27 00:24:54 crc kubenswrapper[4781]: I0227 00:24:54.990938 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9402a6e-66bb-4e1e-a33f-7fce411c83b8" containerName="oc"
Feb 27 00:24:54 crc kubenswrapper[4781]: I0227 00:24:54.991111 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9402a6e-66bb-4e1e-a33f-7fce411c83b8" containerName="oc"
Feb 27 00:24:54 crc kubenswrapper[4781]: I0227 00:24:54.992518 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.011195 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.011558 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.011706 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.015873 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-qg6tr"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.026063 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vdbg9"]
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.085668 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fltx\" (UniqueName: \"kubernetes.io/projected/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-kube-api-access-8fltx\") pod \"dnsmasq-dns-675f4bcbfc-vdbg9\" (UID: \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.085721 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-config\") pod \"dnsmasq-dns-675f4bcbfc-vdbg9\" (UID: \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.133489 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6drvh"]
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.134598 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.137358 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.186298 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-config\") pod \"dnsmasq-dns-675f4bcbfc-vdbg9\" (UID: \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.186348 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-config\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.186403 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.186437 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z4t2\" (UniqueName: \"kubernetes.io/projected/5411c548-900f-4d1e-816d-8687268b6ebc-kube-api-access-8z4t2\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.186469 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fltx\" (UniqueName: \"kubernetes.io/projected/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-kube-api-access-8fltx\") pod \"dnsmasq-dns-675f4bcbfc-vdbg9\" (UID: \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.187243 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-config\") pod \"dnsmasq-dns-675f4bcbfc-vdbg9\" (UID: \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.197587 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6drvh"]
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.215772 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fltx\" (UniqueName: \"kubernetes.io/projected/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-kube-api-access-8fltx\") pod \"dnsmasq-dns-675f4bcbfc-vdbg9\" (UID: \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.287473 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-config\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.287869 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.287913 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z4t2\" (UniqueName: \"kubernetes.io/projected/5411c548-900f-4d1e-816d-8687268b6ebc-kube-api-access-8z4t2\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.288346 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-config\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.288606 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.303145 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z4t2\" (UniqueName: \"kubernetes.io/projected/5411c548-900f-4d1e-816d-8687268b6ebc-kube-api-access-8z4t2\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.311195 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.452176 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh"
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.783225 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vdbg9"]
Feb 27 00:24:55 crc kubenswrapper[4781]: W0227 00:24:55.785872 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc73aa093_0d39_41f9_a0bd_35e621c4cf8c.slice/crio-060eb712a24d9ca196230b121a82b836df7e85841645aaea05dcf490cac6e106 WatchSource:0}: Error finding container 060eb712a24d9ca196230b121a82b836df7e85841645aaea05dcf490cac6e106: Status 404 returned error can't find the container with id 060eb712a24d9ca196230b121a82b836df7e85841645aaea05dcf490cac6e106
Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.890895 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6drvh"]
Feb 27 00:24:55 crc kubenswrapper[4781]: W0227 00:24:55.892295 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5411c548_900f_4d1e_816d_8687268b6ebc.slice/crio-705ae1ac3f4cc27e243e281ed7ae1b8b6a29f990c8825db10f1d6ca4d40585cf WatchSource:0}: Error finding container 705ae1ac3f4cc27e243e281ed7ae1b8b6a29f990c8825db10f1d6ca4d40585cf: Status 404 returned error can't find the container with id 705ae1ac3f4cc27e243e281ed7ae1b8b6a29f990c8825db10f1d6ca4d40585cf
Feb 27 00:24:56 crc kubenswrapper[4781]: I0227 00:24:56.385210 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" event={"ID":"c73aa093-0d39-41f9-a0bd-35e621c4cf8c","Type":"ContainerStarted","Data":"060eb712a24d9ca196230b121a82b836df7e85841645aaea05dcf490cac6e106"}
Feb 27 00:24:56 crc kubenswrapper[4781]: I0227 00:24:56.395973 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" event={"ID":"5411c548-900f-4d1e-816d-8687268b6ebc","Type":"ContainerStarted","Data":"705ae1ac3f4cc27e243e281ed7ae1b8b6a29f990c8825db10f1d6ca4d40585cf"}
Feb 27 00:24:57 crc kubenswrapper[4781]: I0227 00:24:57.808240 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vdbg9"]
Feb 27 00:24:57 crc kubenswrapper[4781]: I0227 00:24:57.841998 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-86z46"]
Feb 27 00:24:57 crc kubenswrapper[4781]: I0227 00:24:57.843179 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-86z46"
Feb 27 00:24:57 crc kubenswrapper[4781]: I0227 00:24:57.858436 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-86z46"]
Feb 27 00:24:57 crc kubenswrapper[4781]: I0227 00:24:57.940343 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-dns-svc\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46"
Feb 27 00:24:57 crc kubenswrapper[4781]: I0227 00:24:57.940406 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-config\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46"
Feb 27 00:24:57 crc kubenswrapper[4781]: I0227 00:24:57.940436 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gk5b\" (UniqueName: \"kubernetes.io/projected/12142f3c-5849-4af1-8c9e-c92304d3c375-kube-api-access-8gk5b\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46"
Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.042344 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-dns-svc\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46"
Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.042398 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-config\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46"
Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.042423 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gk5b\" (UniqueName: \"kubernetes.io/projected/12142f3c-5849-4af1-8c9e-c92304d3c375-kube-api-access-8gk5b\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46"
Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.043428 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-dns-svc\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46"
Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.043951 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-config\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46"
Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.070394 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gk5b\" (UniqueName: \"kubernetes.io/projected/12142f3c-5849-4af1-8c9e-c92304d3c375-kube-api-access-8gk5b\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.169973 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.210786 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6drvh"] Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.271377 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9h55"] Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.273474 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.280999 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9h55"] Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.354770 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.354825 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw2z9\" (UniqueName: \"kubernetes.io/projected/0f2c76ec-cfab-4f18-b624-722021700885-kube-api-access-fw2z9\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" 
Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.354870 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-config\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.455901 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.455949 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw2z9\" (UniqueName: \"kubernetes.io/projected/0f2c76ec-cfab-4f18-b624-722021700885-kube-api-access-fw2z9\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.455982 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-config\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.456939 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-config\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.457253 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.480174 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw2z9\" (UniqueName: \"kubernetes.io/projected/0f2c76ec-cfab-4f18-b624-722021700885-kube-api-access-fw2z9\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.639746 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.722138 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-86z46"] Feb 27 00:24:58 crc kubenswrapper[4781]: W0227 00:24:58.732843 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12142f3c_5849_4af1_8c9e_c92304d3c375.slice/crio-719bdff771357569191b4394e9f9c49ad9d8c251f65a73dabcbbf8b3ba8bdaa2 WatchSource:0}: Error finding container 719bdff771357569191b4394e9f9c49ad9d8c251f65a73dabcbbf8b3ba8bdaa2: Status 404 returned error can't find the container with id 719bdff771357569191b4394e9f9c49ad9d8c251f65a73dabcbbf8b3ba8bdaa2 Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.022422 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.024145 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.027756 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jcfdg" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.032268 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.032351 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.032565 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.032644 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.032785 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.033822 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.034793 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.069828 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.069893 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-server-conf\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.069984 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.070103 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.070149 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-config-data\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.070201 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/919ba171-1971-416c-99c1-5dfcacc10a28-pod-info\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.070234 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/919ba171-1971-416c-99c1-5dfcacc10a28-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.070313 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.070367 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf9tq\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-kube-api-access-tf9tq\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.070478 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.070545 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.100760 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9h55"] Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 
00:24:59.178218 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.178269 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-config-data\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.178307 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/919ba171-1971-416c-99c1-5dfcacc10a28-pod-info\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.178342 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/919ba171-1971-416c-99c1-5dfcacc10a28-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.178390 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.178422 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf9tq\" (UniqueName: 
\"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-kube-api-access-tf9tq\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.178891 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.178939 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.179229 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.179300 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-server-conf\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.179485 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.179937 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.183404 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.183574 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.184005 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.192002 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-server-conf\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.197754 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/919ba171-1971-416c-99c1-5dfcacc10a28-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.197779 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/919ba171-1971-416c-99c1-5dfcacc10a28-pod-info\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.198518 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.198543 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a1416593fd912ec74c6e12871251980e537685bd157bf8eba211fce64d9b048a/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.199949 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.200906 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.206369 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf9tq\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-kube-api-access-tf9tq\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.238724 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.361133 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.408456 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.412309 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.417007 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.417063 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.417091 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.417237 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.417420 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.417549 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.417579 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-t6n8b" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.430786 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.453667 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-86z46" event={"ID":"12142f3c-5849-4af1-8c9e-c92304d3c375","Type":"ContainerStarted","Data":"719bdff771357569191b4394e9f9c49ad9d8c251f65a73dabcbbf8b3ba8bdaa2"} Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.456750 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" 
event={"ID":"0f2c76ec-cfab-4f18-b624-722021700885","Type":"ContainerStarted","Data":"893d8f643c3cbe02cd19b59bf3115d432c587df0f05ea410ba0d0253101d7031"} Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488067 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488110 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488141 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488156 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488183 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-config-data\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488205 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488225 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488269 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488285 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488309 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488323 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnr8b\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-kube-api-access-dnr8b\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.588948 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.588987 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589025 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589053 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589082 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589127 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589143 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589170 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589191 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnr8b\" (UniqueName: 
\"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-kube-api-access-dnr8b\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589231 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589257 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.590013 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.590121 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.590879 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.591306 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.592094 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.593235 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.594199 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.596128 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc 
kubenswrapper[4781]: I0227 00:24:59.597037 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.597069 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a2eaf337fb87b6a71958dbd52c87dbf5c448ea95938dfd82cb1cc22a9e40efc9/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.600130 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.615981 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnr8b\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-kube-api-access-dnr8b\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.642412 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.757077 4781 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.959321 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.436777 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.439042 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.442348 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.443752 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-mnp2q" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.443917 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.445878 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.451113 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.452844 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.508137 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d59d3864-af0d-407c-8431-ae2e17e4b46f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 
00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.508224 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-config-data-default\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.508250 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-kolla-config\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.508268 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d59d3864-af0d-407c-8431-ae2e17e4b46f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.508287 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrjv2\" (UniqueName: \"kubernetes.io/projected/d59d3864-af0d-407c-8431-ae2e17e4b46f-kube-api-access-hrjv2\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.508306 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: 
I0227 00:25:00.508325 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59d3864-af0d-407c-8431-ae2e17e4b46f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.508399 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6418f01a-d10c-4778-8bab-9c32fe35f8ae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6418f01a-d10c-4778-8bab-9c32fe35f8ae\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.609453 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6418f01a-d10c-4778-8bab-9c32fe35f8ae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6418f01a-d10c-4778-8bab-9c32fe35f8ae\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.609510 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d59d3864-af0d-407c-8431-ae2e17e4b46f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.609537 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-config-data-default\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: 
I0227 00:25:00.609556 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-kolla-config\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.609575 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d59d3864-af0d-407c-8431-ae2e17e4b46f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.609589 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrjv2\" (UniqueName: \"kubernetes.io/projected/d59d3864-af0d-407c-8431-ae2e17e4b46f-kube-api-access-hrjv2\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.609608 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.609639 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59d3864-af0d-407c-8431-ae2e17e4b46f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.610111 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/d59d3864-af0d-407c-8431-ae2e17e4b46f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.610838 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-kolla-config\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.611678 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-config-data-default\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.612032 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.624311 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d59d3864-af0d-407c-8431-ae2e17e4b46f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.629958 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59d3864-af0d-407c-8431-ae2e17e4b46f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " 
pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.658998 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.659274 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6418f01a-d10c-4778-8bab-9c32fe35f8ae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6418f01a-d10c-4778-8bab-9c32fe35f8ae\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/af56ff13fee24b73e63022e5c2516390dc1057be7bc55daf5aed00165d3047c9/globalmount\"" pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.659086 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrjv2\" (UniqueName: \"kubernetes.io/projected/d59d3864-af0d-407c-8431-ae2e17e4b46f-kube-api-access-hrjv2\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.804261 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6418f01a-d10c-4778-8bab-9c32fe35f8ae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6418f01a-d10c-4778-8bab-9c32fe35f8ae\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.067583 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.914402 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.915591 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.922911 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-dn6l5" Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.923901 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.924590 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.925350 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.938525 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.938573 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.938611 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.938686 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22624edd-e366-4aff-84dd-c3cec89c0591-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.938710 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22624edd-e366-4aff-84dd-c3cec89c0591-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.938738 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgj9x\" (UniqueName: \"kubernetes.io/projected/22624edd-e366-4aff-84dd-c3cec89c0591-kube-api-access-jgj9x\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.938761 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22624edd-e366-4aff-84dd-c3cec89c0591-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:01 crc 
kubenswrapper[4781]: I0227 00:25:01.938787 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-83f1a113-e66d-4ace-84c6-db992b1c9591\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83f1a113-e66d-4ace-84c6-db992b1c9591\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.946190 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.040930 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22624edd-e366-4aff-84dd-c3cec89c0591-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.041051 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22624edd-e366-4aff-84dd-c3cec89c0591-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.041090 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgj9x\" (UniqueName: \"kubernetes.io/projected/22624edd-e366-4aff-84dd-c3cec89c0591-kube-api-access-jgj9x\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.041114 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/22624edd-e366-4aff-84dd-c3cec89c0591-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.041150 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-83f1a113-e66d-4ace-84c6-db992b1c9591\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83f1a113-e66d-4ace-84c6-db992b1c9591\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.041208 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.041240 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.041300 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.041748 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/22624edd-e366-4aff-84dd-c3cec89c0591-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.042613 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.043797 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.053575 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22624edd-e366-4aff-84dd-c3cec89c0591-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.056026 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.062872 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22624edd-e366-4aff-84dd-c3cec89c0591-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.065100 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgj9x\" (UniqueName: \"kubernetes.io/projected/22624edd-e366-4aff-84dd-c3cec89c0591-kube-api-access-jgj9x\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.065257 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.065284 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-83f1a113-e66d-4ace-84c6-db992b1c9591\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83f1a113-e66d-4ace-84c6-db992b1c9591\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2626e085f5eebde1c78ff01a19f2918ba6aad6cd8b70016bc6cf3611ba49beaf/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.111836 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-83f1a113-e66d-4ace-84c6-db992b1c9591\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83f1a113-e66d-4ace-84c6-db992b1c9591\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.189310 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.190808 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.195421 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.195746 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-9jjf8" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.196029 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.202711 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.243997 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06e98c4a-d812-4e42-b95c-d263e49bf5d3-kolla-config\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.244038 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e98c4a-d812-4e42-b95c-d263e49bf5d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.244137 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e98c4a-d812-4e42-b95c-d263e49bf5d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.244162 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06e98c4a-d812-4e42-b95c-d263e49bf5d3-config-data\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.244189 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmvrr\" (UniqueName: \"kubernetes.io/projected/06e98c4a-d812-4e42-b95c-d263e49bf5d3-kube-api-access-mmvrr\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.244308 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.345293 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmvrr\" (UniqueName: \"kubernetes.io/projected/06e98c4a-d812-4e42-b95c-d263e49bf5d3-kube-api-access-mmvrr\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.345337 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06e98c4a-d812-4e42-b95c-d263e49bf5d3-kolla-config\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.345357 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e98c4a-d812-4e42-b95c-d263e49bf5d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.345482 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e98c4a-d812-4e42-b95c-d263e49bf5d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.345511 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06e98c4a-d812-4e42-b95c-d263e49bf5d3-config-data\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.347478 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06e98c4a-d812-4e42-b95c-d263e49bf5d3-kolla-config\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.349210 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06e98c4a-d812-4e42-b95c-d263e49bf5d3-config-data\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.351328 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e98c4a-d812-4e42-b95c-d263e49bf5d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.363501 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e98c4a-d812-4e42-b95c-d263e49bf5d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0" Feb 27 00:25:02 crc 
kubenswrapper[4781]: I0227 00:25:02.366487 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmvrr\" (UniqueName: \"kubernetes.io/projected/06e98c4a-d812-4e42-b95c-d263e49bf5d3-kube-api-access-mmvrr\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0" Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.508533 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.372081 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.373180 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.376234 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-nmsfm" Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.383975 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.476970 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr9c5\" (UniqueName: \"kubernetes.io/projected/91997a3e-9e65-4eab-a0b9-8f9c639a8d05-kube-api-access-hr9c5\") pod \"kube-state-metrics-0\" (UID: \"91997a3e-9e65-4eab-a0b9-8f9c639a8d05\") " pod="openstack/kube-state-metrics-0" Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.564513 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"919ba171-1971-416c-99c1-5dfcacc10a28","Type":"ContainerStarted","Data":"f58e1ef93098c46c57b5e59fd849c5fcd9c3a1bc9f7c9d503b32be5e67364d02"} Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.578013 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hr9c5\" (UniqueName: \"kubernetes.io/projected/91997a3e-9e65-4eab-a0b9-8f9c639a8d05-kube-api-access-hr9c5\") pod \"kube-state-metrics-0\" (UID: \"91997a3e-9e65-4eab-a0b9-8f9c639a8d05\") " pod="openstack/kube-state-metrics-0" Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.617565 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr9c5\" (UniqueName: \"kubernetes.io/projected/91997a3e-9e65-4eab-a0b9-8f9c639a8d05-kube-api-access-hr9c5\") pod \"kube-state-metrics-0\" (UID: \"91997a3e-9e65-4eab-a0b9-8f9c639a8d05\") " pod="openstack/kube-state-metrics-0" Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.695479 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.003134 4781 scope.go:117] "RemoveContainer" containerID="86bad95d795a7faf37cb19be6e8217786d2cabd57a047f7210f59250bf6bee2f" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.143486 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.145236 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.148496 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.148772 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.152710 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.152739 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-77w49" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.152758 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.216690 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.315103 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blz5w\" (UniqueName: \"kubernetes.io/projected/58009056-4183-4017-bfa1-c14ce28b92ea-kube-api-access-blz5w\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.315145 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " 
pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.315189 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.315212 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.315234 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/58009056-4183-4017-bfa1-c14ce28b92ea-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.315294 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/58009056-4183-4017-bfa1-c14ce28b92ea-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.315320 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/58009056-4183-4017-bfa1-c14ce28b92ea-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" 
(UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.416523 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/58009056-4183-4017-bfa1-c14ce28b92ea-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.416569 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/58009056-4183-4017-bfa1-c14ce28b92ea-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.417163 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blz5w\" (UniqueName: \"kubernetes.io/projected/58009056-4183-4017-bfa1-c14ce28b92ea-kube-api-access-blz5w\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.417188 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.417214 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-web-config\") pod \"alertmanager-metric-storage-0\" (UID: 
\"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.417247 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.417556 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/58009056-4183-4017-bfa1-c14ce28b92ea-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.418000 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/58009056-4183-4017-bfa1-c14ce28b92ea-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.424386 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/58009056-4183-4017-bfa1-c14ce28b92ea-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.425064 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/58009056-4183-4017-bfa1-c14ce28b92ea-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " 
pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.425776 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.425961 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.428558 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.458314 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blz5w\" (UniqueName: \"kubernetes.io/projected/58009056-4183-4017-bfa1-c14ce28b92ea-kube-api-access-blz5w\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.461176 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.764916 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.767002 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.769806 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.769884 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.770066 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.770109 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.770202 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-zmzb4" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.770239 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.770313 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.770067 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.795407 4781 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926289 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f85c54b-b800-429a-ba2d-fe22056ac907-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926340 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-config\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926502 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926608 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926672 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2945l\" (UniqueName: 
\"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-kube-api-access-2945l\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926715 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926752 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926872 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926990 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.927024 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029018 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029063 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029099 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f85c54b-b800-429a-ba2d-fe22056ac907-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029121 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-config\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029158 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029190 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029210 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2945l\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-kube-api-access-2945l\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029239 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029259 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " 
pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029285 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.030927 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.030956 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.030998 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.034113 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.034257 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.034343 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.034584 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-config\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.034746 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f85c54b-b800-429a-ba2d-fe22056ac907-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.036127 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.036181 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b26095f48a6799aae7472dc34ad76c7f8559a3fa84033df1f18203d2595242ed/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.053728 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2945l\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-kube-api-access-2945l\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.072874 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.094479 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:07 crc kubenswrapper[4781]: I0227 00:25:07.958102 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-hcb9s"] Feb 27 00:25:07 crc kubenswrapper[4781]: I0227 00:25:07.960242 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:07 crc kubenswrapper[4781]: I0227 00:25:07.962494 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 27 00:25:07 crc kubenswrapper[4781]: I0227 00:25:07.962841 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rxqbb" Feb 27 00:25:07 crc kubenswrapper[4781]: I0227 00:25:07.977259 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hcb9s"] Feb 27 00:25:07 crc kubenswrapper[4781]: I0227 00:25:07.987874 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9zkpb"] Feb 27 00:25:07 crc kubenswrapper[4781]: I0227 00:25:07.989062 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.030417 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.048609 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9zkpb"] Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.060941 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-run\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.060997 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-log\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " 
pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.061024 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrnx8\" (UniqueName: \"kubernetes.io/projected/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-kube-api-access-vrnx8\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.061059 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-lib\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.061110 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-scripts\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.061135 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-etc-ovs\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162397 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrnx8\" (UniqueName: \"kubernetes.io/projected/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-kube-api-access-vrnx8\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " 
pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162457 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/092921e0-a033-4021-b0f5-9c89de3aa830-scripts\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162478 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-lib\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162498 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092921e0-a033-4021-b0f5-9c89de3aa830-combined-ca-bundle\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162528 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/092921e0-a033-4021-b0f5-9c89de3aa830-ovn-controller-tls-certs\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162567 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-scripts\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: 
I0227 00:25:08.162685 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-run\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162760 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-etc-ovs\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162935 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj29s\" (UniqueName: \"kubernetes.io/projected/092921e0-a033-4021-b0f5-9c89de3aa830-kube-api-access-rj29s\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162996 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-run\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162998 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-lib\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.163042 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-log-ovn\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.163066 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-etc-ovs\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.163070 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-run-ovn\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.163137 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-log\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.163137 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-run\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.163237 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-log\") pod \"ovn-controller-ovs-hcb9s\" (UID: 
\"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.164695 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-scripts\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.193175 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrnx8\" (UniqueName: \"kubernetes.io/projected/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-kube-api-access-vrnx8\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.254473 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.255885 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.259776 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.259789 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-m8pgd" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.260154 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.260156 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.260793 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264093 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-log-ovn\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264122 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-run-ovn\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264169 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/092921e0-a033-4021-b0f5-9c89de3aa830-scripts\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " 
pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264191 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092921e0-a033-4021-b0f5-9c89de3aa830-combined-ca-bundle\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264220 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/092921e0-a033-4021-b0f5-9c89de3aa830-ovn-controller-tls-certs\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264258 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-run\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264310 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj29s\" (UniqueName: \"kubernetes.io/projected/092921e0-a033-4021-b0f5-9c89de3aa830-kube-api-access-rj29s\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264399 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-log-ovn\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264498 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-run-ovn\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264744 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-run\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.266432 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/092921e0-a033-4021-b0f5-9c89de3aa830-scripts\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.269239 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092921e0-a033-4021-b0f5-9c89de3aa830-combined-ca-bundle\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.271462 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/092921e0-a033-4021-b0f5-9c89de3aa830-ovn-controller-tls-certs\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.274586 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.281473 4781 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.310097 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj29s\" (UniqueName: \"kubernetes.io/projected/092921e0-a033-4021-b0f5-9c89de3aa830-kube-api-access-rj29s\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.350049 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.366525 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.366570 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd103c67-d035-4de1-aba9-667d1eb67813-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.366594 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.366618 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-c1d1aeca-fc6d-46c3-b5d4-052d2fe646dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1d1aeca-fc6d-46c3-b5d4-052d2fe646dc\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.366668 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdtk4\" (UniqueName: \"kubernetes.io/projected/bd103c67-d035-4de1-aba9-667d1eb67813-kube-api-access-vdtk4\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.366686 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd103c67-d035-4de1-aba9-667d1eb67813-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.366705 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd103c67-d035-4de1-aba9-667d1eb67813-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.366747 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.468567 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.468619 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c1d1aeca-fc6d-46c3-b5d4-052d2fe646dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1d1aeca-fc6d-46c3-b5d4-052d2fe646dc\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.468683 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdtk4\" (UniqueName: \"kubernetes.io/projected/bd103c67-d035-4de1-aba9-667d1eb67813-kube-api-access-vdtk4\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.468715 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd103c67-d035-4de1-aba9-667d1eb67813-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.468735 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd103c67-d035-4de1-aba9-667d1eb67813-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.468779 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.468832 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.468859 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd103c67-d035-4de1-aba9-667d1eb67813-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.469687 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd103c67-d035-4de1-aba9-667d1eb67813-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.470983 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd103c67-d035-4de1-aba9-667d1eb67813-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.473148 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd103c67-d035-4de1-aba9-667d1eb67813-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.474342 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.476941 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.476969 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c1d1aeca-fc6d-46c3-b5d4-052d2fe646dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1d1aeca-fc6d-46c3-b5d4-052d2fe646dc\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7d22a4830c7704a84644fe2448f23da2d221b5f43f3bfc24558389697e21a972/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.477336 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.477473 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.489558 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdtk4\" (UniqueName: 
\"kubernetes.io/projected/bd103c67-d035-4de1-aba9-667d1eb67813-kube-api-access-vdtk4\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.547512 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c1d1aeca-fc6d-46c3-b5d4-052d2fe646dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1d1aeca-fc6d-46c3-b5d4-052d2fe646dc\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.675379 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.853338 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.856572 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.860585 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.860771 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.860843 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.860965 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-4hx8w" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.861182 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.946446 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.946500 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-54fcded5-5d2b-4d43-a81f-13a4be11c64c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-54fcded5-5d2b-4d43-a81f-13a4be11c64c\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.946526 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.946565 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.946599 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.946612 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.946646 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-config\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.946664 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4ccq\" (UniqueName: 
\"kubernetes.io/projected/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-kube-api-access-s4ccq\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048136 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048202 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-54fcded5-5d2b-4d43-a81f-13a4be11c64c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-54fcded5-5d2b-4d43-a81f-13a4be11c64c\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048236 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048281 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048325 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048347 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048371 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4ccq\" (UniqueName: \"kubernetes.io/projected/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-kube-api-access-s4ccq\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048393 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-config\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048698 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.049572 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.051361 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-config\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.052311 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.052347 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-54fcded5-5d2b-4d43-a81f-13a4be11c64c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-54fcded5-5d2b-4d43-a81f-13a4be11c64c\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c97dcfd59732a972091dce3593a8b31fc374fcdbce6ba9daf533c75d52555044/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.054497 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.057339 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.059133 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-combined-ca-bundle\") pod 
\"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.068889 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4ccq\" (UniqueName: \"kubernetes.io/projected/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-kube-api-access-s4ccq\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.090870 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-54fcded5-5d2b-4d43-a81f-13a4be11c64c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-54fcded5-5d2b-4d43-a81f-13a4be11c64c\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.173534 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.441049 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.446950 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.486089 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.486432 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.486657 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-dnw9v" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.486859 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.487409 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.493067 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.604746 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njkb6\" (UniqueName: \"kubernetes.io/projected/a5170e93-09e9-40d2-ac65-b87d44ceb185-kube-api-access-njkb6\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.604835 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5170e93-09e9-40d2-ac65-b87d44ceb185-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") 
" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.604968 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.604994 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.605017 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.618984 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.623251 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.626349 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.626681 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.626842 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.641275 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.703428 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.704648 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708450 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njkb6\" (UniqueName: \"kubernetes.io/projected/a5170e93-09e9-40d2-ac65-b87d44ceb185-kube-api-access-njkb6\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708493 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d71cee9c-2288-4843-ab71-0720c8527073-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708554 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708587 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5170e93-09e9-40d2-ac65-b87d44ceb185-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708665 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708701 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708721 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708737 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrtqr\" (UniqueName: \"kubernetes.io/projected/d71cee9c-2288-4843-ab71-0720c8527073-kube-api-access-rrtqr\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708754 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708778 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708802 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.709664 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.711074 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5170e93-09e9-40d2-ac65-b87d44ceb185-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " 
pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.720040 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.725540 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.732144 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.734840 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.735109 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njkb6\" (UniqueName: \"kubernetes.io/projected/a5170e93-09e9-40d2-ac65-b87d44ceb185-kube-api-access-njkb6\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.749786 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Feb 27 
00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810004 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810083 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d71cee9c-2288-4843-ab71-0720c8527073-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810167 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810196 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810279 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e3acc2-cee4-4bfe-af04-3a64041fc327-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810513 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810664 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810776 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml94h\" (UniqueName: \"kubernetes.io/projected/d9e3acc2-cee4-4bfe-af04-3a64041fc327-kube-api-access-ml94h\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810906 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-loki-s3\") pod 
\"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.811007 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrtqr\" (UniqueName: \"kubernetes.io/projected/d71cee9c-2288-4843-ab71-0720c8527073-kube-api-access-rrtqr\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.811116 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.811463 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d71cee9c-2288-4843-ab71-0720c8527073-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.812237 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.817013 4781 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.822373 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.824458 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.827390 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.837836 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrtqr\" (UniqueName: \"kubernetes.io/projected/d71cee9c-2288-4843-ab71-0720c8527073-kube-api-access-rrtqr\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.868314 4781 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.869800 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.878205 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.878237 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.878375 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.878445 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.878558 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.878731 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.878834 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.880431 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.884135 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-vrztz" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.884711 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.890005 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.912291 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.912726 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e3acc2-cee4-4bfe-af04-3a64041fc327-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.912844 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " 
pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.912990 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml94h\" (UniqueName: \"kubernetes.io/projected/d9e3acc2-cee4-4bfe-af04-3a64041fc327-kube-api-access-ml94h\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.913147 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.913795 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.914242 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e3acc2-cee4-4bfe-af04-3a64041fc327-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.920547 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.931004 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.931910 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml94h\" (UniqueName: \"kubernetes.io/projected/d9e3acc2-cee4-4bfe-af04-3a64041fc327-kube-api-access-ml94h\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.943600 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.014699 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.014849 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.014885 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.014923 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015002 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015083 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015111 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015140 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015158 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" 
(UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015238 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015267 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015287 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015323 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015342 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp2qg\" (UniqueName: \"kubernetes.io/projected/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-kube-api-access-sp2qg\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015382 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015403 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015428 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015446 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glsph\" (UniqueName: \"kubernetes.io/projected/233250c8-3871-43ec-8c1d-47bd1d3133e1-kube-api-access-glsph\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.101743 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.116316 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.116355 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.116379 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc 
kubenswrapper[4781]: I0227 00:25:16.116408 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.116443 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.116460 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.116972 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.116997 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: 
\"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117025 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117043 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117062 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117084 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117103 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp2qg\" (UniqueName: 
\"kubernetes.io/projected/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-kube-api-access-sp2qg\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117126 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117142 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117163 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117178 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glsph\" (UniqueName: \"kubernetes.io/projected/233250c8-3871-43ec-8c1d-47bd1d3133e1-kube-api-access-glsph\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 
00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117214 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117620 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117892 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.118512 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.118647 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.118662 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.118715 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.118914 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.119002 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc 
kubenswrapper[4781]: I0227 00:25:16.119051 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.119122 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.120817 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.121015 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.121537 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.122180 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.122490 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.124967 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.137230 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glsph\" (UniqueName: \"kubernetes.io/projected/233250c8-3871-43ec-8c1d-47bd1d3133e1-kube-api-access-glsph\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.138972 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp2qg\" (UniqueName: 
\"kubernetes.io/projected/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-kube-api-access-sp2qg\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.203806 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.252616 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.608137 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.610665 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.616290 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.616570 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.628038 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.672694 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.676702 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.680267 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.680421 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.687221 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.726575 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.726657 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhhw9\" (UniqueName: \"kubernetes.io/projected/2691e066-2f4c-4e7e-bcac-01933bd6cadb-kube-api-access-bhhw9\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.726688 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2691e066-2f4c-4e7e-bcac-01933bd6cadb-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.726719 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.726765 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.726790 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.726942 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.727162 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " 
pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.758838 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.760602 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.763270 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.765490 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.768718 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829268 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829313 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn45d\" (UniqueName: \"kubernetes.io/projected/42503ae1-b143-45c3-8789-e2d1f72cc335-kube-api-access-dn45d\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829338 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829363 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829394 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829418 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829438 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 
00:25:16.829458 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42503ae1-b143-45c3-8789-e2d1f72cc335-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829485 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829504 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829525 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829554 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/684ccdab-ae41-466c-bf47-78c3ada41164-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " 
pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829585 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp4hx\" (UniqueName: \"kubernetes.io/projected/684ccdab-ae41-466c-bf47-78c3ada41164-kube-api-access-rp4hx\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829621 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829667 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829690 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829738 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829755 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829791 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhhw9\" (UniqueName: \"kubernetes.io/projected/2691e066-2f4c-4e7e-bcac-01933bd6cadb-kube-api-access-bhhw9\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829818 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829836 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2691e066-2f4c-4e7e-bcac-01933bd6cadb-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829869 
4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.831020 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.831123 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2691e066-2f4c-4e7e-bcac-01933bd6cadb-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.831133 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.831233 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc 
kubenswrapper[4781]: I0227 00:25:16.836951 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.837390 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.837840 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.848034 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhhw9\" (UniqueName: \"kubernetes.io/projected/2691e066-2f4c-4e7e-bcac-01933bd6cadb-kube-api-access-bhhw9\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.861869 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 
crc kubenswrapper[4781]: I0227 00:25:16.867320 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.930962 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42503ae1-b143-45c3-8789-e2d1f72cc335-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.931263 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.931897 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42503ae1-b143-45c3-8789-e2d1f72cc335-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932300 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932353 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932396 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/684ccdab-ae41-466c-bf47-78c3ada41164-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932441 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp4hx\" (UniqueName: \"kubernetes.io/projected/684ccdab-ae41-466c-bf47-78c3ada41164-kube-api-access-rp4hx\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932492 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932529 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 
crc kubenswrapper[4781]: I0227 00:25:16.932561 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932650 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932804 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932830 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn45d\" (UniqueName: \"kubernetes.io/projected/42503ae1-b143-45c3-8789-e2d1f72cc335-kube-api-access-dn45d\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932867 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-index-gateway-grpc\") pod 
\"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932961 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.933001 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.933968 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.934017 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/684ccdab-ae41-466c-bf47-78c3ada41164-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.934196 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.937042 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.939894 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.943061 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.945959 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.949331 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.950438 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.950892 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.952671 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.954917 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn45d\" (UniqueName: \"kubernetes.io/projected/42503ae1-b143-45c3-8789-e2d1f72cc335-kube-api-access-dn45d\") 
pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.956881 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp4hx\" (UniqueName: \"kubernetes.io/projected/684ccdab-ae41-466c-bf47-78c3ada41164-kube-api-access-rp4hx\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.967619 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.970440 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:17 crc kubenswrapper[4781]: I0227 00:25:17.006580 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:17 crc kubenswrapper[4781]: I0227 00:25:17.100364 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.408497 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.409142 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8fltx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:
*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-vdbg9_openstack(c73aa093-0d39-41f9-a0bd-35e621c4cf8c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.410366 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" podUID="c73aa093-0d39-41f9-a0bd-35e621c4cf8c" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.488407 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.488567 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8z4t2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-6drvh_openstack(5411c548-900f-4d1e-816d-8687268b6ebc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.490272 4781 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" podUID="5411c548-900f-4d1e-816d-8687268b6ebc" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.497065 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.497252 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8gk5b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-86z46_openstack(12142f3c-5849-4af1-8c9e-c92304d3c375): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.506331 4781 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.506481 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fw2z9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesyste
m:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-f9h55_openstack(0f2c76ec-cfab-4f18-b624-722021700885): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.508176 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-86z46" podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.509688 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" podUID="0f2c76ec-cfab-4f18-b624-722021700885" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.785403 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-86z46" podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.785665 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" 
pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" podUID="0f2c76ec-cfab-4f18-b624-722021700885" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.112784 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.118808 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.452063 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" Feb 27 00:25:18 crc kubenswrapper[4781]: W0227 00:25:18.455973 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22624edd_e366_4aff_84dd_c3cec89c0591.slice/crio-fc122a09a1ad822388f597d9278445e9c9b6bce2b48664dd2b6b7c6b7437eac4 WatchSource:0}: Error finding container fc122a09a1ad822388f597d9278445e9c9b6bce2b48664dd2b6b7c6b7437eac4: Status 404 returned error can't find the container with id fc122a09a1ad822388f597d9278445e9c9b6bce2b48664dd2b6b7c6b7437eac4 Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.462097 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.471182 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.571616 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z4t2\" (UniqueName: \"kubernetes.io/projected/5411c548-900f-4d1e-816d-8687268b6ebc-kube-api-access-8z4t2\") pod \"5411c548-900f-4d1e-816d-8687268b6ebc\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.571702 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-dns-svc\") pod \"5411c548-900f-4d1e-816d-8687268b6ebc\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.571756 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-config\") pod \"5411c548-900f-4d1e-816d-8687268b6ebc\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.571799 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fltx\" (UniqueName: \"kubernetes.io/projected/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-kube-api-access-8fltx\") pod \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\" (UID: \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\") " Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.571832 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-config\") pod \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\" (UID: \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\") " Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.572401 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-config" (OuterVolumeSpecName: "config") pod "5411c548-900f-4d1e-816d-8687268b6ebc" (UID: "5411c548-900f-4d1e-816d-8687268b6ebc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.572496 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-config" (OuterVolumeSpecName: "config") pod "c73aa093-0d39-41f9-a0bd-35e621c4cf8c" (UID: "c73aa093-0d39-41f9-a0bd-35e621c4cf8c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.572578 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5411c548-900f-4d1e-816d-8687268b6ebc" (UID: "5411c548-900f-4d1e-816d-8687268b6ebc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.573048 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.573067 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.573077 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.576766 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-kube-api-access-8fltx" (OuterVolumeSpecName: "kube-api-access-8fltx") pod "c73aa093-0d39-41f9-a0bd-35e621c4cf8c" (UID: "c73aa093-0d39-41f9-a0bd-35e621c4cf8c"). InnerVolumeSpecName "kube-api-access-8fltx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.578923 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5411c548-900f-4d1e-816d-8687268b6ebc-kube-api-access-8z4t2" (OuterVolumeSpecName: "kube-api-access-8z4t2") pod "5411c548-900f-4d1e-816d-8687268b6ebc" (UID: "5411c548-900f-4d1e-816d-8687268b6ebc"). InnerVolumeSpecName "kube-api-access-8z4t2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:25:18 crc kubenswrapper[4781]: W0227 00:25:18.664111 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7ca2a9f_a42e_4d9b_89a7_f2590842f328.slice/crio-231128a0e69346808037ae68c6b271f51f365db9ad7d7761da0f5c52d5d3f07d WatchSource:0}: Error finding container 231128a0e69346808037ae68c6b271f51f365db9ad7d7761da0f5c52d5d3f07d: Status 404 returned error can't find the container with id 231128a0e69346808037ae68c6b271f51f365db9ad7d7761da0f5c52d5d3f07d Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.674617 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.674978 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z4t2\" (UniqueName: \"kubernetes.io/projected/5411c548-900f-4d1e-816d-8687268b6ebc-kube-api-access-8z4t2\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.675025 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fltx\" (UniqueName: \"kubernetes.io/projected/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-kube-api-access-8fltx\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.752922 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"06e98c4a-d812-4e42-b95c-d263e49bf5d3","Type":"ContainerStarted","Data":"c5464028f7513fcdae391f484a3a38c5d42febbb578693bd7eafd9d1eac440ff"} Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.754392 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c7ca2a9f-a42e-4d9b-89a7-f2590842f328","Type":"ContainerStarted","Data":"231128a0e69346808037ae68c6b271f51f365db9ad7d7761da0f5c52d5d3f07d"} Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.755483 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d59d3864-af0d-407c-8431-ae2e17e4b46f","Type":"ContainerStarted","Data":"6b8f7a80085810149f097d16464405f2d3e453688708d9148af3e4e13a432507"} Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.756406 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" event={"ID":"c73aa093-0d39-41f9-a0bd-35e621c4cf8c","Type":"ContainerDied","Data":"060eb712a24d9ca196230b121a82b836df7e85841645aaea05dcf490cac6e106"} Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.756479 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.763968 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.764019 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" event={"ID":"5411c548-900f-4d1e-816d-8687268b6ebc","Type":"ContainerDied","Data":"705ae1ac3f4cc27e243e281ed7ae1b8b6a29f990c8825db10f1d6ca4d40585cf"} Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.772201 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"22624edd-e366-4aff-84dd-c3cec89c0591","Type":"ContainerStarted","Data":"fc122a09a1ad822388f597d9278445e9c9b6bce2b48664dd2b6b7c6b7437eac4"} Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.838071 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.845191 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.851591 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-9zkpb"] Feb 27 00:25:18 crc kubenswrapper[4781]: W0227 00:25:18.856088 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91997a3e_9e65_4eab_a0b9_8f9c639a8d05.slice/crio-40455368adae6ae08a72863a733dea4cab1b575394a61b8c7f6b3e11518f1446 WatchSource:0}: Error finding container 40455368adae6ae08a72863a733dea4cab1b575394a61b8c7f6b3e11518f1446: Status 404 returned error can't find the container with id 40455368adae6ae08a72863a733dea4cab1b575394a61b8c7f6b3e11518f1446 Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.859559 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 00:25:18 crc kubenswrapper[4781]: W0227 00:25:18.866041 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58009056_4183_4017_bfa1_c14ce28b92ea.slice/crio-493c28f23519740c1b7fbcc8f895aa01b0fc47933b8169023d997be516c2a502 WatchSource:0}: Error finding container 493c28f23519740c1b7fbcc8f895aa01b0fc47933b8169023d997be516c2a502: Status 404 returned error can't find the container with id 493c28f23519740c1b7fbcc8f895aa01b0fc47933b8169023d997be516c2a502 Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.995816 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 27 00:25:18 crc kubenswrapper[4781]: W0227 00:25:18.998984 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd103c67_d035_4de1_aba9_667d1eb67813.slice/crio-3182ebf2704dccde26ab7abd6d192671846f9105b6a3740be2e207d2d08e2243 WatchSource:0}: Error finding container 3182ebf2704dccde26ab7abd6d192671846f9105b6a3740be2e207d2d08e2243: Status 404 returned error can't find the container with id 3182ebf2704dccde26ab7abd6d192671846f9105b6a3740be2e207d2d08e2243 Feb 27 00:25:19 crc 
kubenswrapper[4781]: I0227 00:25:19.130163 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vdbg9"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.137270 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vdbg9"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.153889 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6drvh"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.160373 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6drvh"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.193943 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4"] Feb 27 00:25:19 crc kubenswrapper[4781]: W0227 00:25:19.205461 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd71cee9c_2288_4843_ab71_0720c8527073.slice/crio-cda482192b4e511d4ceb1a90ce18a6df940ec845cf4be5912926c4014eb0c77c WatchSource:0}: Error finding container cda482192b4e511d4ceb1a90ce18a6df940ec845cf4be5912926c4014eb0c77c: Status 404 returned error can't find the container with id cda482192b4e511d4ceb1a90ce18a6df940ec845cf4be5912926c4014eb0c77c Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.322419 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5411c548-900f-4d1e-816d-8687268b6ebc" path="/var/lib/kubelet/pods/5411c548-900f-4d1e-816d-8687268b6ebc/volumes" Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.322835 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c73aa093-0d39-41f9-a0bd-35e621c4cf8c" path="/var/lib/kubelet/pods/c73aa093-0d39-41f9-a0bd-35e621c4cf8c/volumes" Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.397783 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.410868 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.421526 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf"] Feb 27 00:25:19 crc kubenswrapper[4781]: W0227 00:25:19.437215 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5170e93_09e9_40d2_ac65_b87d44ceb185.slice/crio-688b627a224232d5a2d9839ff3427637e8527398cb09527e50a702a1895864c0 WatchSource:0}: Error finding container 688b627a224232d5a2d9839ff3427637e8527398cb09527e50a702a1895864c0: Status 404 returned error can't find the container with id 688b627a224232d5a2d9839ff3427637e8527398cb09527e50a702a1895864c0 Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.439908 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.482938 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.503807 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.511365 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.794017 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"684ccdab-ae41-466c-bf47-78c3ada41164","Type":"ContainerStarted","Data":"c00c80bfccef04e1fc565a7c44c1a1deb9ddb678c0b87849935b93fed5452439"} Feb 27 
00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.795535 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerStarted","Data":"2ad75abe5f1e9859dec62d9d7e1f0e4f7552fc881d371d2d01763329d31bdef8"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.797823 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" event={"ID":"d71cee9c-2288-4843-ab71-0720c8527073","Type":"ContainerStarted","Data":"cda482192b4e511d4ceb1a90ce18a6df940ec845cf4be5912926c4014eb0c77c"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.799074 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"2691e066-2f4c-4e7e-bcac-01933bd6cadb","Type":"ContainerStarted","Data":"6bafacaf9526831ff083e02e39024ee90f76bc912035296c37f6794c2d351566"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.800164 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"58009056-4183-4017-bfa1-c14ce28b92ea","Type":"ContainerStarted","Data":"493c28f23519740c1b7fbcc8f895aa01b0fc47933b8169023d997be516c2a502"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.801585 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"42503ae1-b143-45c3-8789-e2d1f72cc335","Type":"ContainerStarted","Data":"f3ca9f27b392ca0e92f0701d7b70193de66a094d153f08887bc1e331065b719f"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.802804 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb" event={"ID":"092921e0-a033-4021-b0f5-9c89de3aa830","Type":"ContainerStarted","Data":"cebfceb5ae1ea44437777c92aec223fc929f53d637b67e35eade18f05d4d3c1c"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.804040 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bd103c67-d035-4de1-aba9-667d1eb67813","Type":"ContainerStarted","Data":"3182ebf2704dccde26ab7abd6d192671846f9105b6a3740be2e207d2d08e2243"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.805187 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" event={"ID":"877c39ec-0202-4987-b6e7-4fb90c4dc9b5","Type":"ContainerStarted","Data":"33f26099ff4150504b0c6d4ea4dfaf4293bf0feaa48dbc817e863e2d58ba5f60"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.806818 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" event={"ID":"233250c8-3871-43ec-8c1d-47bd1d3133e1","Type":"ContainerStarted","Data":"7b2d54735183971b72c72a7da74b00000e7d041b9f935da739b6b6ec312eac0c"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.808107 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"91997a3e-9e65-4eab-a0b9-8f9c639a8d05","Type":"ContainerStarted","Data":"40455368adae6ae08a72863a733dea4cab1b575394a61b8c7f6b3e11518f1446"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.811407 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"919ba171-1971-416c-99c1-5dfcacc10a28","Type":"ContainerStarted","Data":"96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.814332 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" event={"ID":"a5170e93-09e9-40d2-ac65-b87d44ceb185","Type":"ContainerStarted","Data":"688b627a224232d5a2d9839ff3427637e8527398cb09527e50a702a1895864c0"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.816785 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" event={"ID":"d9e3acc2-cee4-4bfe-af04-3a64041fc327","Type":"ContainerStarted","Data":"86208fb43402d857d08f4221e080cb603cac9014d9ef8af1aacc6e1db6fc5d11"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.910844 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hcb9s"] Feb 27 00:25:20 crc kubenswrapper[4781]: I0227 00:25:20.234853 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 27 00:25:20 crc kubenswrapper[4781]: I0227 00:25:20.826242 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c7ca2a9f-a42e-4d9b-89a7-f2590842f328","Type":"ContainerStarted","Data":"592b25e10dba92f06ec6db612c25fdc12d9afc456496a972e547225b9ac93f91"} Feb 27 00:25:20 crc kubenswrapper[4781]: I0227 00:25:20.828491 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hcb9s" event={"ID":"9c2c498e-52b1-4ee2-bcf8-3599ee89513c","Type":"ContainerStarted","Data":"9140f071fe8e777af99e3a3ff97717072ad39af1f26192169806802a3ca79168"} Feb 27 00:25:21 crc kubenswrapper[4781]: I0227 00:25:21.837752 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7d499c77-ccba-41d1-9efb-8424fc7e8d0e","Type":"ContainerStarted","Data":"d0ecf3d712f99c7845d7b07b68ef8e7492fbd51a59105297fe43602417b3296c"} Feb 27 00:25:33 crc kubenswrapper[4781]: E0227 00:25:33.909129 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981" Feb 27 00:25:33 crc kubenswrapper[4781]: E0227 00:25:33.909833 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:loki-compactor,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981,Command:[],Args:[-target=compactor -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:storage,ReadOnly:false,MountPath:/tmp/loki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-compactor-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-compactor-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveRea
dOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dn45d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-compactor-0_openstack(42503ae1-b143-45c3-8789-e2d1f72cc335): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:25:33 crc kubenswrapper[4781]: E0227 00:25:33.911057 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-compactor\" 
with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="42503ae1-b143-45c3-8789-e2d1f72cc335" Feb 27 00:25:34 crc kubenswrapper[4781]: E0227 00:25:34.088674 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981" Feb 27 00:25:34 crc kubenswrapper[4781]: E0227 00:25:34.088911 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-query-frontend,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981,Command:[],Args:[-target=query-frontend -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml 
-config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-query-frontend-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-query-frontend-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ml94h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f_openstack(d9e3acc2-cee4-4bfe-af04-3a64041fc327): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:25:34 crc kubenswrapper[4781]: E0227 00:25:34.090075 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-query-frontend\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" podUID="d9e3acc2-cee4-4bfe-af04-3a64041fc327" Feb 27 00:25:34 crc kubenswrapper[4781]: E0227 00:25:34.280084 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-compactor\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="42503ae1-b143-45c3-8789-e2d1f72cc335" Feb 27 00:25:34 crc kubenswrapper[4781]: E0227 
00:25:34.280164 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-query-frontend\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" podUID="d9e3acc2-cee4-4bfe-af04-3a64041fc327" Feb 27 00:25:34 crc kubenswrapper[4781]: E0227 00:25:34.671155 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934" Feb 27 00:25:34 crc kubenswrapper[4781]: E0227 00:25:34.671351 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:gateway,Image:registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934,Command:[],Args:[--debug.name=lokistack-gateway --web.listen=0.0.0.0:8080 --web.internal.listen=0.0.0.0:8081 --web.healthchecks.url=https://localhost:8080 --log.level=warn --logs.read.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.tail.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.write.endpoint=https://cloudkitty-lokistack-distributor-http.openstack.svc.cluster.local:3100 --logs.write-timeout=4m0s --rbac.config=/etc/lokistack-gateway/rbac.yaml --tenants.config=/etc/lokistack-gateway/tenants.yaml --server.read-timeout=48s --server.write-timeout=6m0s --tls.min-version=VersionTLS12 --tls.server.cert-file=/var/run/tls/http/server/tls.crt --tls.server.key-file=/var/run/tls/http/server/tls.key --tls.healthchecks.server-ca-file=/var/run/ca/server/service-ca.crt 
--tls.healthchecks.server-name=cloudkitty-lokistack-gateway-http.openstack.svc.cluster.local --tls.internal.server.cert-file=/var/run/tls/http/server/tls.crt --tls.internal.server.key-file=/var/run/tls/http/server/tls.key --tls.min-version=VersionTLS12 --tls.cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --logs.tls.ca-file=/var/run/ca/upstream/service-ca.crt --logs.tls.cert-file=/var/run/tls/http/upstream/tls.crt --logs.tls.key-file=/var/run/tls/http/upstream/tls.key --tls.client-auth-type=RequestClientCert],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},ContainerPort{Name:public,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rbac,ReadOnly:true,MountPath:/etc/lokistack-gateway/rbac.yaml,SubPath:rbac.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tenants,ReadOnly:true,MountPath:/etc/lokistack-gateway/tenants.yaml,SubPath:tenants.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lokistack-gateway,ReadOnly:true,MountPath:/etc/lokistack-gateway/lokistack-gateway.rego,SubPath:lokistack-gateway.rego,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-secret,ReadOnly:true,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-client-http,ReadOnly:true,MountPath:/var/run/tls/http/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudki
tty-lokistack-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-ca-bundle,ReadOnly:false,MountPath:/var/run/tenants-ca/cloudkitty,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sp2qg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/live,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:12,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-gateway-7f8685b49f-bxttl_openstack(877c39ec-0202-4987-b6e7-4fb90c4dc9b5): ErrImagePull: rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:25:34 crc kubenswrapper[4781]: E0227 00:25:34.672682 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" podUID="877c39ec-0202-4987-b6e7-4fb90c4dc9b5" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.011699 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.011904 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-nb,Image:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n6h65ch588h587h5c8h5b8h576hcch654h5dchb8h68h6bh65fh569h555hddh547h5hd8h5c4h68fh68ch687h99h5f9h5f9h587h57dhd4h5dbhd8q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Na
me:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vdtk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof 
ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(bd103c67-d035-4de1-aba9-667d1eb67813): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.232811 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.233089 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock 
--certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key --ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nc4h64fh576h5bdh58ch667h546h664h5fh579h58bhbdh84hd7hb8h554h5c9h74h67ch8ch5cfh566h5b9hd8h89h58bh654h96h5c8h5f8h5d5h59fq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rj29s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/
local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-9zkpb_openstack(092921e0-a033-4021-b0f5-9c89de3aa830): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.235169 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-9zkpb" podUID="092921e0-a033-4021-b0f5-9c89de3aa830" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.245739 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.245931 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:gateway,Image:registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934,Command:[],Args:[--debug.name=lokistack-gateway --web.listen=0.0.0.0:8080 --web.internal.listen=0.0.0.0:8081 --web.healthchecks.url=https://localhost:8080 --log.level=warn --logs.read.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.tail.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.write.endpoint=https://cloudkitty-lokistack-distributor-http.openstack.svc.cluster.local:3100 --logs.write-timeout=4m0s --rbac.config=/etc/lokistack-gateway/rbac.yaml --tenants.config=/etc/lokistack-gateway/tenants.yaml --server.read-timeout=48s --server.write-timeout=6m0s --tls.min-version=VersionTLS12 --tls.server.cert-file=/var/run/tls/http/server/tls.crt --tls.server.key-file=/var/run/tls/http/server/tls.key --tls.healthchecks.server-ca-file=/var/run/ca/server/service-ca.crt --tls.healthchecks.server-name=cloudkitty-lokistack-gateway-http.openstack.svc.cluster.local --tls.internal.server.cert-file=/var/run/tls/http/server/tls.crt --tls.internal.server.key-file=/var/run/tls/http/server/tls.key --tls.min-version=VersionTLS12 --tls.cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 
--logs.tls.ca-file=/var/run/ca/upstream/service-ca.crt --logs.tls.cert-file=/var/run/tls/http/upstream/tls.crt --logs.tls.key-file=/var/run/tls/http/upstream/tls.key --tls.client-auth-type=RequestClientCert],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},ContainerPort{Name:public,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rbac,ReadOnly:true,MountPath:/etc/lokistack-gateway/rbac.yaml,SubPath:rbac.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tenants,ReadOnly:true,MountPath:/etc/lokistack-gateway/tenants.yaml,SubPath:tenants.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lokistack-gateway,ReadOnly:true,MountPath:/etc/lokistack-gateway/lokistack-gateway.rego,SubPath:lokistack-gateway.rego,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-secret,ReadOnly:true,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-client-http,ReadOnly:true,MountPath:/var/run/tls/http/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-ca-bundle,ReadOnly:false,MountPath:/var/run/tenants-ca/cloudkitty,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-glsph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagat
ion:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/live,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:12,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-gateway-7f8685b49f-mj87x_openstack(233250c8-3871-43ec-8c1d-47bd1d3133e1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.247138 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" podUID="233250c8-3871-43ec-8c1d-47bd1d3133e1" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.292798 4781 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-9zkpb" podUID="092921e0-a033-4021-b0f5-9c89de3aa830" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.292972 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934\\\"\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" podUID="877c39ec-0202-4987-b6e7-4fb90c4dc9b5" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.292983 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934\\\"\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" podUID="233250c8-3871-43ec-8c1d-47bd1d3133e1" Feb 27 00:25:36 crc kubenswrapper[4781]: E0227 00:25:36.340541 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Feb 27 00:25:36 crc kubenswrapper[4781]: E0227 00:25:36.340885 4781 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Feb 27 00:25:36 crc kubenswrapper[4781]: E0227 00:25:36.341006 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hr9c5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
kube-state-metrics-0_openstack(91997a3e-9e65-4eab-a0b9-8f9c639a8d05): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:25:36 crc kubenswrapper[4781]: E0227 00:25:36.342152 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="91997a3e-9e65-4eab-a0b9-8f9c639a8d05" Feb 27 00:25:37 crc kubenswrapper[4781]: E0227 00:25:37.315863 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="91997a3e-9e65-4eab-a0b9-8f9c639a8d05" Feb 27 00:25:38 crc kubenswrapper[4781]: E0227 00:25:38.041003 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="bd103c67-d035-4de1-aba9-667d1eb67813" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.323939 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d59d3864-af0d-407c-8431-ae2e17e4b46f","Type":"ContainerStarted","Data":"5bbf5b24d4d9008f5b75e4c0fa41c61058490ccdd91cacafc1c8d754b500b1d9"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.325888 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"22624edd-e366-4aff-84dd-c3cec89c0591","Type":"ContainerStarted","Data":"560bdbc40641b47d07958de312ae75dadef9033fa90a1674b4544f472b537552"} Feb 27 00:25:38 crc kubenswrapper[4781]: 
I0227 00:25:38.328307 4781 generic.go:334] "Generic (PLEG): container finished" podID="12142f3c-5849-4af1-8c9e-c92304d3c375" containerID="27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43" exitCode=0 Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.328340 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-86z46" event={"ID":"12142f3c-5849-4af1-8c9e-c92304d3c375","Type":"ContainerDied","Data":"27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.329704 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" event={"ID":"a5170e93-09e9-40d2-ac65-b87d44ceb185","Type":"ContainerStarted","Data":"b77bd5c6b95e65b09b6b96f9bf95bd4d4fbe2391eda0dbf3e303eee73cece604"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.329824 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.334367 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hcb9s" event={"ID":"9c2c498e-52b1-4ee2-bcf8-3599ee89513c","Type":"ContainerStarted","Data":"d640ffe0e868584caf90344e7b76246aa7b432b903e1a81afcd65d5f36b832bf"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.336307 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7d499c77-ccba-41d1-9efb-8424fc7e8d0e","Type":"ContainerStarted","Data":"237ed7dac6f99d58c104a037ad50883c6586975e333239c944c324f9de747ddb"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.338981 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" 
event={"ID":"d71cee9c-2288-4843-ab71-0720c8527073","Type":"ContainerStarted","Data":"ec8b42000edf747a02b6a4cf8c4141b9c4d8a4ac9fcd17ee9c7836c6dfe6b2c1"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.339142 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.340672 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"06e98c4a-d812-4e42-b95c-d263e49bf5d3","Type":"ContainerStarted","Data":"754acbf64e09afaf10eb0f4a70a3d85f343afd74ee94f34a891dcbfb091a16b1"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.340803 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.343347 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"2691e066-2f4c-4e7e-bcac-01933bd6cadb","Type":"ContainerStarted","Data":"a72d3ac6b8270c1de5ecd0dbf7eed61277df81168d435784e3cced07cdd620d9"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.343804 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.357391 4781 generic.go:334] "Generic (PLEG): container finished" podID="0f2c76ec-cfab-4f18-b624-722021700885" containerID="2b13d210b4d0aa474faa06d53125b956502bdd360564f39dba668453c12f7cac" exitCode=0 Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.357660 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" event={"ID":"0f2c76ec-cfab-4f18-b624-722021700885","Type":"ContainerDied","Data":"2b13d210b4d0aa474faa06d53125b956502bdd360564f39dba668453c12f7cac"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.360478 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"684ccdab-ae41-466c-bf47-78c3ada41164","Type":"ContainerStarted","Data":"7b9134bf688881cccad49f93b644492fdb6b4260e2b8b5acedfa0c1b7c24253e"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.361300 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.362755 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bd103c67-d035-4de1-aba9-667d1eb67813","Type":"ContainerStarted","Data":"6e0f680e79bb8afab38cf17b90ba7154adcea3356785931d4b932c63706aaed5"} Feb 27 00:25:38 crc kubenswrapper[4781]: E0227 00:25:38.364439 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="bd103c67-d035-4de1-aba9-667d1eb67813" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.381141 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" podStartSLOduration=6.938862835 podStartE2EDuration="23.381122412s" podCreationTimestamp="2026-02-27 00:25:15 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.216268503 +0000 UTC m=+1188.473808067" lastFinishedPulling="2026-02-27 00:25:35.65852808 +0000 UTC m=+1204.916067644" observedRunningTime="2026-02-27 00:25:38.37723522 +0000 UTC m=+1207.634774804" watchObservedRunningTime="2026-02-27 00:25:38.381122412 +0000 UTC m=+1207.638661976" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.440773 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" podStartSLOduration=7.087853876 podStartE2EDuration="23.440755987s" 
podCreationTimestamp="2026-02-27 00:25:15 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.445701094 +0000 UTC m=+1188.703240648" lastFinishedPulling="2026-02-27 00:25:35.798603205 +0000 UTC m=+1205.056142759" observedRunningTime="2026-02-27 00:25:38.418983626 +0000 UTC m=+1207.676523180" watchObservedRunningTime="2026-02-27 00:25:38.440755987 +0000 UTC m=+1207.698295541" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.464997 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.418143513 podStartE2EDuration="36.464977623s" podCreationTimestamp="2026-02-27 00:25:02 +0000 UTC" firstStartedPulling="2026-02-27 00:25:18.186623605 +0000 UTC m=+1187.444163159" lastFinishedPulling="2026-02-27 00:25:35.233457715 +0000 UTC m=+1204.490997269" observedRunningTime="2026-02-27 00:25:38.4389811 +0000 UTC m=+1207.696520654" watchObservedRunningTime="2026-02-27 00:25:38.464977623 +0000 UTC m=+1207.722517177" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.514241 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=6.919902829 podStartE2EDuration="23.514215965s" podCreationTimestamp="2026-02-27 00:25:15 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.442141521 +0000 UTC m=+1188.699681075" lastFinishedPulling="2026-02-27 00:25:36.036454657 +0000 UTC m=+1205.293994211" observedRunningTime="2026-02-27 00:25:38.510366934 +0000 UTC m=+1207.767906528" watchObservedRunningTime="2026-02-27 00:25:38.514215965 +0000 UTC m=+1207.771755519" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.544405 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=7.63373501 podStartE2EDuration="23.544186221s" podCreationTimestamp="2026-02-27 00:25:15 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.446663929 +0000 UTC 
m=+1188.704203483" lastFinishedPulling="2026-02-27 00:25:35.35711514 +0000 UTC m=+1204.614654694" observedRunningTime="2026-02-27 00:25:38.542146688 +0000 UTC m=+1207.799686252" watchObservedRunningTime="2026-02-27 00:25:38.544186221 +0000 UTC m=+1207.801725775" Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.375902 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-86z46" event={"ID":"12142f3c-5849-4af1-8c9e-c92304d3c375","Type":"ContainerStarted","Data":"60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0"} Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.376650 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.378493 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" event={"ID":"0f2c76ec-cfab-4f18-b624-722021700885","Type":"ContainerStarted","Data":"b86f56bb626afb20727e0941782a4964e5b9d6b09cdb0c43236b0144e5335336"} Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.378843 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.381071 4781 generic.go:334] "Generic (PLEG): container finished" podID="9c2c498e-52b1-4ee2-bcf8-3599ee89513c" containerID="d640ffe0e868584caf90344e7b76246aa7b432b903e1a81afcd65d5f36b832bf" exitCode=0 Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.381140 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hcb9s" event={"ID":"9c2c498e-52b1-4ee2-bcf8-3599ee89513c","Type":"ContainerDied","Data":"d640ffe0e868584caf90344e7b76246aa7b432b903e1a81afcd65d5f36b832bf"} Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.385448 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"7d499c77-ccba-41d1-9efb-8424fc7e8d0e","Type":"ContainerStarted","Data":"f666762f8af0c431e8b4e9973700280f9534a62e3f6cf2c22edee444abe0c23b"} Feb 27 00:25:39 crc kubenswrapper[4781]: E0227 00:25:39.388834 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="bd103c67-d035-4de1-aba9-667d1eb67813" Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.417660 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-86z46" podStartSLOduration=4.207561082 podStartE2EDuration="42.417608801s" podCreationTimestamp="2026-02-27 00:24:57 +0000 UTC" firstStartedPulling="2026-02-27 00:24:58.735313798 +0000 UTC m=+1167.992853352" lastFinishedPulling="2026-02-27 00:25:36.945361517 +0000 UTC m=+1206.202901071" observedRunningTime="2026-02-27 00:25:39.4061478 +0000 UTC m=+1208.663687344" watchObservedRunningTime="2026-02-27 00:25:39.417608801 +0000 UTC m=+1208.675148365" Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.432459 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=14.111983561 podStartE2EDuration="28.43243731s" podCreationTimestamp="2026-02-27 00:25:11 +0000 UTC" firstStartedPulling="2026-02-27 00:25:21.71607066 +0000 UTC m=+1190.973610214" lastFinishedPulling="2026-02-27 00:25:36.036524409 +0000 UTC m=+1205.294063963" observedRunningTime="2026-02-27 00:25:39.427379797 +0000 UTC m=+1208.684919351" watchObservedRunningTime="2026-02-27 00:25:39.43243731 +0000 UTC m=+1208.689976864" Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.503774 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" podStartSLOduration=4.287085078 
podStartE2EDuration="41.503759681s" podCreationTimestamp="2026-02-27 00:24:58 +0000 UTC" firstStartedPulling="2026-02-27 00:24:59.112072484 +0000 UTC m=+1168.369612038" lastFinishedPulling="2026-02-27 00:25:36.328747087 +0000 UTC m=+1205.586286641" observedRunningTime="2026-02-27 00:25:39.50105867 +0000 UTC m=+1208.758598224" watchObservedRunningTime="2026-02-27 00:25:39.503759681 +0000 UTC m=+1208.761299235" Feb 27 00:25:40 crc kubenswrapper[4781]: I0227 00:25:40.174330 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:40 crc kubenswrapper[4781]: I0227 00:25:40.394105 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerStarted","Data":"a458867b742ce8b5b3fdd2c97ebf1845a6845fd00e046dd893821ec44de7237b"} Feb 27 00:25:40 crc kubenswrapper[4781]: I0227 00:25:40.396020 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"58009056-4183-4017-bfa1-c14ce28b92ea","Type":"ContainerStarted","Data":"f357b9c4effa13cc010fc9a965fa4ab73f4546260126cc8790d5d9d2d0f6d40a"} Feb 27 00:25:40 crc kubenswrapper[4781]: I0227 00:25:40.399786 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hcb9s" event={"ID":"9c2c498e-52b1-4ee2-bcf8-3599ee89513c","Type":"ContainerStarted","Data":"32c722853a219e907aa6805e822320dfcad64f8bcea351eff620f51e0edb3944"} Feb 27 00:25:40 crc kubenswrapper[4781]: I0227 00:25:40.399832 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hcb9s" event={"ID":"9c2c498e-52b1-4ee2-bcf8-3599ee89513c","Type":"ContainerStarted","Data":"5b960311cd3cb59db0abef0b4fb7b64b110d4b25173600bae19c5bad1cf9cc14"} Feb 27 00:25:40 crc kubenswrapper[4781]: I0227 00:25:40.400044 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:40 crc kubenswrapper[4781]: I0227 00:25:40.400076 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:40 crc kubenswrapper[4781]: I0227 00:25:40.448098 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-hcb9s" podStartSLOduration=17.607914563 podStartE2EDuration="33.448081391s" podCreationTimestamp="2026-02-27 00:25:07 +0000 UTC" firstStartedPulling="2026-02-27 00:25:20.015815443 +0000 UTC m=+1189.273354997" lastFinishedPulling="2026-02-27 00:25:35.855982271 +0000 UTC m=+1205.113521825" observedRunningTime="2026-02-27 00:25:40.442501055 +0000 UTC m=+1209.700040619" watchObservedRunningTime="2026-02-27 00:25:40.448081391 +0000 UTC m=+1209.705620945" Feb 27 00:25:42 crc kubenswrapper[4781]: I0227 00:25:42.413062 4781 generic.go:334] "Generic (PLEG): container finished" podID="d59d3864-af0d-407c-8431-ae2e17e4b46f" containerID="5bbf5b24d4d9008f5b75e4c0fa41c61058490ccdd91cacafc1c8d754b500b1d9" exitCode=0 Feb 27 00:25:42 crc kubenswrapper[4781]: I0227 00:25:42.413130 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d59d3864-af0d-407c-8431-ae2e17e4b46f","Type":"ContainerDied","Data":"5bbf5b24d4d9008f5b75e4c0fa41c61058490ccdd91cacafc1c8d754b500b1d9"} Feb 27 00:25:42 crc kubenswrapper[4781]: I0227 00:25:42.415301 4781 generic.go:334] "Generic (PLEG): container finished" podID="22624edd-e366-4aff-84dd-c3cec89c0591" containerID="560bdbc40641b47d07958de312ae75dadef9033fa90a1674b4544f472b537552" exitCode=0 Feb 27 00:25:42 crc kubenswrapper[4781]: I0227 00:25:42.415328 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"22624edd-e366-4aff-84dd-c3cec89c0591","Type":"ContainerDied","Data":"560bdbc40641b47d07958de312ae75dadef9033fa90a1674b4544f472b537552"} Feb 27 00:25:42 crc 
kubenswrapper[4781]: I0227 00:25:42.510678 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.174349 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.211384 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.425374 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"22624edd-e366-4aff-84dd-c3cec89c0591","Type":"ContainerStarted","Data":"833f840d5f18e25dc205e5a7d1f612bc756ef91d8c591872057d4535f5197ff3"} Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.427875 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d59d3864-af0d-407c-8431-ae2e17e4b46f","Type":"ContainerStarted","Data":"ba009ef21836006b9ce41f8f3fecd84b090a6860eb2edc515cf9e9ff611d75de"} Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.451911 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.250317561 podStartE2EDuration="43.451891803s" podCreationTimestamp="2026-02-27 00:25:00 +0000 UTC" firstStartedPulling="2026-02-27 00:25:18.457754399 +0000 UTC m=+1187.715293953" lastFinishedPulling="2026-02-27 00:25:35.659328651 +0000 UTC m=+1204.916868195" observedRunningTime="2026-02-27 00:25:43.443211185 +0000 UTC m=+1212.700750779" watchObservedRunningTime="2026-02-27 00:25:43.451891803 +0000 UTC m=+1212.709431367" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.471170 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.298542248 podStartE2EDuration="44.471153149s" podCreationTimestamp="2026-02-27 
00:24:59 +0000 UTC" firstStartedPulling="2026-02-27 00:25:18.184688654 +0000 UTC m=+1187.442228208" lastFinishedPulling="2026-02-27 00:25:35.357299555 +0000 UTC m=+1204.614839109" observedRunningTime="2026-02-27 00:25:43.465293785 +0000 UTC m=+1212.722833369" watchObservedRunningTime="2026-02-27 00:25:43.471153149 +0000 UTC m=+1212.728692713" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.485527 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.787006 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-86z46"] Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.787204 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-86z46" podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" containerName="dnsmasq-dns" containerID="cri-o://60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0" gracePeriod=10 Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.816056 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xl8n7"] Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.817429 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.821971 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.834929 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xl8n7"] Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.907064 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-config\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.907170 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.907354 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.907559 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxvhb\" (UniqueName: \"kubernetes.io/projected/9c3386c0-cab4-47ef-b395-af90773f2796-kube-api-access-rxvhb\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " 
pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.913959 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-hx85z"] Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.914993 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.924238 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.938246 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hx85z"] Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.011086 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.011178 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cf463d95-25dd-4b99-afb0-dac99157c5fa-ovn-rundir\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.011238 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.011295 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf463d95-25dd-4b99-afb0-dac99157c5fa-combined-ca-bundle\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.012031 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.012170 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.011325 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cf463d95-25dd-4b99-afb0-dac99157c5fa-ovs-rundir\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.012358 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxvhb\" (UniqueName: \"kubernetes.io/projected/9c3386c0-cab4-47ef-b395-af90773f2796-kube-api-access-rxvhb\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.012831 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrwsx\" (UniqueName: \"kubernetes.io/projected/cf463d95-25dd-4b99-afb0-dac99157c5fa-kube-api-access-rrwsx\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.012855 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf463d95-25dd-4b99-afb0-dac99157c5fa-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.012883 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf463d95-25dd-4b99-afb0-dac99157c5fa-config\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.012911 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-config\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.013443 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-config\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.035861 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rxvhb\" (UniqueName: \"kubernetes.io/projected/9c3386c0-cab4-47ef-b395-af90773f2796-kube-api-access-rxvhb\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.114801 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf463d95-25dd-4b99-afb0-dac99157c5fa-combined-ca-bundle\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.114855 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cf463d95-25dd-4b99-afb0-dac99157c5fa-ovs-rundir\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.114927 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrwsx\" (UniqueName: \"kubernetes.io/projected/cf463d95-25dd-4b99-afb0-dac99157c5fa-kube-api-access-rrwsx\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.114945 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf463d95-25dd-4b99-afb0-dac99157c5fa-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.114969 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf463d95-25dd-4b99-afb0-dac99157c5fa-config\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.115012 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cf463d95-25dd-4b99-afb0-dac99157c5fa-ovn-rundir\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.115285 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cf463d95-25dd-4b99-afb0-dac99157c5fa-ovn-rundir\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.115330 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cf463d95-25dd-4b99-afb0-dac99157c5fa-ovs-rundir\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.116421 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf463d95-25dd-4b99-afb0-dac99157c5fa-config\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.118755 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf463d95-25dd-4b99-afb0-dac99157c5fa-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.134897 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.136430 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf463d95-25dd-4b99-afb0-dac99157c5fa-combined-ca-bundle\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.140751 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrwsx\" (UniqueName: \"kubernetes.io/projected/cf463d95-25dd-4b99-afb0-dac99157c5fa-kube-api-access-rrwsx\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.263506 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9h55"] Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.263727 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" podUID="0f2c76ec-cfab-4f18-b624-722021700885" containerName="dnsmasq-dns" containerID="cri-o://b86f56bb626afb20727e0941782a4964e5b9d6b09cdb0c43236b0144e5335336" gracePeriod=10 Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.273140 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.277522 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.307530 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4p8q8"] Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.321329 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.325939 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.351444 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4p8q8"] Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.425725 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.425813 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-config\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.425834 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4cks\" (UniqueName: \"kubernetes.io/projected/4e0e3c40-86af-4986-bf58-fa79ce187828-kube-api-access-j4cks\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 
00:25:44.425859 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.425879 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.444310 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.456571 4781 generic.go:334] "Generic (PLEG): container finished" podID="0f2c76ec-cfab-4f18-b624-722021700885" containerID="b86f56bb626afb20727e0941782a4964e5b9d6b09cdb0c43236b0144e5335336" exitCode=0 Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.456769 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" event={"ID":"0f2c76ec-cfab-4f18-b624-722021700885","Type":"ContainerDied","Data":"b86f56bb626afb20727e0941782a4964e5b9d6b09cdb0c43236b0144e5335336"} Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.465798 4781 generic.go:334] "Generic (PLEG): container finished" podID="12142f3c-5849-4af1-8c9e-c92304d3c375" containerID="60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0" exitCode=0 Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.468169 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.468462 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-86z46" event={"ID":"12142f3c-5849-4af1-8c9e-c92304d3c375","Type":"ContainerDied","Data":"60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0"} Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.468541 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-86z46" event={"ID":"12142f3c-5849-4af1-8c9e-c92304d3c375","Type":"ContainerDied","Data":"719bdff771357569191b4394e9f9c49ad9d8c251f65a73dabcbbf8b3ba8bdaa2"} Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.468567 4781 scope.go:117] "RemoveContainer" containerID="60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.514824 4781 scope.go:117] "RemoveContainer" containerID="27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.528973 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-dns-svc\") pod \"12142f3c-5849-4af1-8c9e-c92304d3c375\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.529181 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gk5b\" (UniqueName: \"kubernetes.io/projected/12142f3c-5849-4af1-8c9e-c92304d3c375-kube-api-access-8gk5b\") pod \"12142f3c-5849-4af1-8c9e-c92304d3c375\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.529241 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-config\") pod \"12142f3c-5849-4af1-8c9e-c92304d3c375\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.529604 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.529656 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.529854 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.529913 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4cks\" (UniqueName: \"kubernetes.io/projected/4e0e3c40-86af-4986-bf58-fa79ce187828-kube-api-access-j4cks\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.529937 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-config\") pod 
\"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.531185 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.531266 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-config\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.531283 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.531291 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.536872 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12142f3c-5849-4af1-8c9e-c92304d3c375-kube-api-access-8gk5b" (OuterVolumeSpecName: "kube-api-access-8gk5b") pod "12142f3c-5849-4af1-8c9e-c92304d3c375" (UID: "12142f3c-5849-4af1-8c9e-c92304d3c375"). 
InnerVolumeSpecName "kube-api-access-8gk5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.542419 4781 scope.go:117] "RemoveContainer" containerID="60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0" Feb 27 00:25:44 crc kubenswrapper[4781]: E0227 00:25:44.545952 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0\": container with ID starting with 60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0 not found: ID does not exist" containerID="60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.546012 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0"} err="failed to get container status \"60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0\": rpc error: code = NotFound desc = could not find container \"60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0\": container with ID starting with 60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0 not found: ID does not exist" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.546043 4781 scope.go:117] "RemoveContainer" containerID="27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43" Feb 27 00:25:44 crc kubenswrapper[4781]: E0227 00:25:44.546483 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43\": container with ID starting with 27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43 not found: ID does not exist" containerID="27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43" Feb 27 00:25:44 
crc kubenswrapper[4781]: I0227 00:25:44.546575 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43"} err="failed to get container status \"27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43\": rpc error: code = NotFound desc = could not find container \"27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43\": container with ID starting with 27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43 not found: ID does not exist" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.551453 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4cks\" (UniqueName: \"kubernetes.io/projected/4e0e3c40-86af-4986-bf58-fa79ce187828-kube-api-access-j4cks\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.593988 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "12142f3c-5849-4af1-8c9e-c92304d3c375" (UID: "12142f3c-5849-4af1-8c9e-c92304d3c375"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.597043 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-config" (OuterVolumeSpecName: "config") pod "12142f3c-5849-4af1-8c9e-c92304d3c375" (UID: "12142f3c-5849-4af1-8c9e-c92304d3c375"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.631446 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gk5b\" (UniqueName: \"kubernetes.io/projected/12142f3c-5849-4af1-8c9e-c92304d3c375-kube-api-access-8gk5b\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.631473 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.631483 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.730016 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.850696 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xl8n7"] Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.872707 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-86z46"] Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.879715 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-86z46"] Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.905696 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xl8n7"] Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.921229 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc44h"] Feb 27 00:25:44 crc kubenswrapper[4781]: E0227 00:25:44.921580 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" containerName="init" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.921598 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" containerName="init" Feb 27 00:25:44 crc kubenswrapper[4781]: E0227 00:25:44.921616 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" containerName="dnsmasq-dns" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.921634 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" containerName="dnsmasq-dns" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.921797 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" containerName="dnsmasq-dns" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.922664 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.966052 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc44h"] Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.049529 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-config\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.049966 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jz5b\" (UniqueName: \"kubernetes.io/projected/8e37b0a7-69ac-439e-9c5a-207210fe40c8-kube-api-access-8jz5b\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" 
Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.050007 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.050025 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.050077 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-dns-svc\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.153608 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-dns-svc\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.153678 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-config\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: 
I0227 00:25:45.153759 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jz5b\" (UniqueName: \"kubernetes.io/projected/8e37b0a7-69ac-439e-9c5a-207210fe40c8-kube-api-access-8jz5b\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.153801 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.153815 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.154681 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-config\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.154687 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.155221 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.158680 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-dns-svc\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.168897 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hx85z"] Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.177402 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jz5b\" (UniqueName: \"kubernetes.io/projected/8e37b0a7-69ac-439e-9c5a-207210fe40c8-kube-api-access-8jz5b\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.186858 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:25:45 crc kubenswrapper[4781]: W0227 00:25:45.190120 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf463d95_25dd_4b99_afb0_dac99157c5fa.slice/crio-691968953b9bd97fdf1a1b394531474f9e18fddde0d7c4e11d17dbcfe9a75c62 WatchSource:0}: Error finding container 691968953b9bd97fdf1a1b394531474f9e18fddde0d7c4e11d17dbcfe9a75c62: Status 404 returned error can't find the container with id 691968953b9bd97fdf1a1b394531474f9e18fddde0d7c4e11d17dbcfe9a75c62 Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.327264 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" path="/var/lib/kubelet/pods/12142f3c-5849-4af1-8c9e-c92304d3c375/volumes" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.356822 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-dns-svc\") pod \"0f2c76ec-cfab-4f18-b624-722021700885\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.357139 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw2z9\" (UniqueName: \"kubernetes.io/projected/0f2c76ec-cfab-4f18-b624-722021700885-kube-api-access-fw2z9\") pod \"0f2c76ec-cfab-4f18-b624-722021700885\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.357167 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-config\") pod \"0f2c76ec-cfab-4f18-b624-722021700885\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.363386 4781 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f2c76ec-cfab-4f18-b624-722021700885-kube-api-access-fw2z9" (OuterVolumeSpecName: "kube-api-access-fw2z9") pod "0f2c76ec-cfab-4f18-b624-722021700885" (UID: "0f2c76ec-cfab-4f18-b624-722021700885"). InnerVolumeSpecName "kube-api-access-fw2z9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.415191 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-config" (OuterVolumeSpecName: "config") pod "0f2c76ec-cfab-4f18-b624-722021700885" (UID: "0f2c76ec-cfab-4f18-b624-722021700885"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.417177 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f2c76ec-cfab-4f18-b624-722021700885" (UID: "0f2c76ec-cfab-4f18-b624-722021700885"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.459526 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.459567 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw2z9\" (UniqueName: \"kubernetes.io/projected/0f2c76ec-cfab-4f18-b624-722021700885-kube-api-access-fw2z9\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.459579 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.476148 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" event={"ID":"0f2c76ec-cfab-4f18-b624-722021700885","Type":"ContainerDied","Data":"893d8f643c3cbe02cd19b59bf3115d432c587df0f05ea410ba0d0253101d7031"} Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.476210 4781 scope.go:117] "RemoveContainer" containerID="b86f56bb626afb20727e0941782a4964e5b9d6b09cdb0c43236b0144e5335336" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.476320 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.482161 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.497993 4781 generic.go:334] "Generic (PLEG): container finished" podID="9c3386c0-cab4-47ef-b395-af90773f2796" containerID="8a7c06cfb85e979a54da7be5977198a52079b1205854f4d374a36e5f20b42991" exitCode=0 Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.498083 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" event={"ID":"9c3386c0-cab4-47ef-b395-af90773f2796","Type":"ContainerDied","Data":"8a7c06cfb85e979a54da7be5977198a52079b1205854f4d374a36e5f20b42991"} Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.498110 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" event={"ID":"9c3386c0-cab4-47ef-b395-af90773f2796","Type":"ContainerStarted","Data":"a67b45e33a1d1e91736f217aef1a74906650aa1373c29592e7ce67091189b528"} Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.507050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hx85z" event={"ID":"cf463d95-25dd-4b99-afb0-dac99157c5fa","Type":"ContainerStarted","Data":"691968953b9bd97fdf1a1b394531474f9e18fddde0d7c4e11d17dbcfe9a75c62"} Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.530765 4781 scope.go:117] "RemoveContainer" containerID="2b13d210b4d0aa474faa06d53125b956502bdd360564f39dba668453c12f7cac" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.555531 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9h55"] Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.569380 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9h55"] Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.577552 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4p8q8"] Feb 27 00:25:45 crc kubenswrapper[4781]: W0227 
00:25:45.622032 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e0e3c40_86af_4986_bf58_fa79ce187828.slice/crio-5f275f16796eb838b7768e9505dd01e92ce2776a3d08fa355285fb9c70fdc93f WatchSource:0}: Error finding container 5f275f16796eb838b7768e9505dd01e92ce2776a3d08fa355285fb9c70fdc93f: Status 404 returned error can't find the container with id 5f275f16796eb838b7768e9505dd01e92ce2776a3d08fa355285fb9c70fdc93f Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.886722 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.975170 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-ovsdbserver-sb\") pod \"9c3386c0-cab4-47ef-b395-af90773f2796\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.975586 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-dns-svc\") pod \"9c3386c0-cab4-47ef-b395-af90773f2796\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.976288 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxvhb\" (UniqueName: \"kubernetes.io/projected/9c3386c0-cab4-47ef-b395-af90773f2796-kube-api-access-rxvhb\") pod \"9c3386c0-cab4-47ef-b395-af90773f2796\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.976341 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-config\") pod 
\"9c3386c0-cab4-47ef-b395-af90773f2796\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.985846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c3386c0-cab4-47ef-b395-af90773f2796-kube-api-access-rxvhb" (OuterVolumeSpecName: "kube-api-access-rxvhb") pod "9c3386c0-cab4-47ef-b395-af90773f2796" (UID: "9c3386c0-cab4-47ef-b395-af90773f2796"). InnerVolumeSpecName "kube-api-access-rxvhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.000362 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9c3386c0-cab4-47ef-b395-af90773f2796" (UID: "9c3386c0-cab4-47ef-b395-af90773f2796"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.006232 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-config" (OuterVolumeSpecName: "config") pod "9c3386c0-cab4-47ef-b395-af90773f2796" (UID: "9c3386c0-cab4-47ef-b395-af90773f2796"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.014551 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c3386c0-cab4-47ef-b395-af90773f2796" (UID: "9c3386c0-cab4-47ef-b395-af90773f2796"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.078977 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxvhb\" (UniqueName: \"kubernetes.io/projected/9c3386c0-cab4-47ef-b395-af90773f2796-kube-api-access-rxvhb\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.079015 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.079028 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.079060 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.107196 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc44h"] Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.130575 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.131593 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2c76ec-cfab-4f18-b624-722021700885" containerName="init" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.131655 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2c76ec-cfab-4f18-b624-722021700885" containerName="init" Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.131719 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2c76ec-cfab-4f18-b624-722021700885" containerName="dnsmasq-dns" 
Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.131728 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2c76ec-cfab-4f18-b624-722021700885" containerName="dnsmasq-dns" Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.131755 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3386c0-cab4-47ef-b395-af90773f2796" containerName="init" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.131763 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3386c0-cab4-47ef-b395-af90773f2796" containerName="init" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.132076 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c3386c0-cab4-47ef-b395-af90773f2796" containerName="init" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.132127 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f2c76ec-cfab-4f18-b624-722021700885" containerName="dnsmasq-dns" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.141295 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.146158 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.146422 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lg72x" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.146574 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.147532 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.149681 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.285980 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.286049 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.286071 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-lock\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 
00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.286102 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-60f728cf-70d2-4c10-ab5d-703f88ca79e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60f728cf-70d2-4c10-ab5d-703f88ca79e1\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.286140 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc7d6\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-kube-api-access-vc7d6\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.286163 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-cache\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.387846 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc7d6\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-kube-api-access-vc7d6\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.387905 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-cache\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.388085 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.388148 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.388177 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-lock\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.388218 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-60f728cf-70d2-4c10-ab5d-703f88ca79e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60f728cf-70d2-4c10-ab5d-703f88ca79e1\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.388893 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.388914 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.388962 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift 
podName:fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11 nodeName:}" failed. No retries permitted until 2026-02-27 00:25:46.888942773 +0000 UTC m=+1216.146482327 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift") pod "swift-storage-0" (UID: "fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11") : configmap "swift-ring-files" not found Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.389280 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-cache\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.389837 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-lock\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.393040 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.398584 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.398635 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-60f728cf-70d2-4c10-ab5d-703f88ca79e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60f728cf-70d2-4c10-ab5d-703f88ca79e1\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cd48650d90eb686d90983f7c34ee50ce064964e161dc3cb092803e098630b47a/globalmount\"" pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.403223 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc7d6\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-kube-api-access-vc7d6\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.428621 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-60f728cf-70d2-4c10-ab5d-703f88ca79e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60f728cf-70d2-4c10-ab5d-703f88ca79e1\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.517699 4781 generic.go:334] "Generic (PLEG): container finished" podID="58009056-4183-4017-bfa1-c14ce28b92ea" containerID="f357b9c4effa13cc010fc9a965fa4ab73f4546260126cc8790d5d9d2d0f6d40a" exitCode=0 Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.517792 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"58009056-4183-4017-bfa1-c14ce28b92ea","Type":"ContainerDied","Data":"f357b9c4effa13cc010fc9a965fa4ab73f4546260126cc8790d5d9d2d0f6d40a"} Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.520541 4781 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" event={"ID":"9c3386c0-cab4-47ef-b395-af90773f2796","Type":"ContainerDied","Data":"a67b45e33a1d1e91736f217aef1a74906650aa1373c29592e7ce67091189b528"} Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.520619 4781 scope.go:117] "RemoveContainer" containerID="8a7c06cfb85e979a54da7be5977198a52079b1205854f4d374a36e5f20b42991" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.520564 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.526404 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gc44h" event={"ID":"8e37b0a7-69ac-439e-9c5a-207210fe40c8","Type":"ContainerStarted","Data":"e1839c0058f09c92d633d8b44bcde9496faf128970e2a8993b81a296f21aac5b"} Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.533025 4781 generic.go:334] "Generic (PLEG): container finished" podID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerID="a458867b742ce8b5b3fdd2c97ebf1845a6845fd00e046dd893821ec44de7237b" exitCode=0 Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.533312 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerDied","Data":"a458867b742ce8b5b3fdd2c97ebf1845a6845fd00e046dd893821ec44de7237b"} Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.536167 4781 generic.go:334] "Generic (PLEG): container finished" podID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerID="aabc0546990906abca2b35d28ee124be17d9e0ff7483abdc7c60ccee1cba5f86" exitCode=0 Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.536296 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" 
event={"ID":"4e0e3c40-86af-4986-bf58-fa79ce187828","Type":"ContainerDied","Data":"aabc0546990906abca2b35d28ee124be17d9e0ff7483abdc7c60ccee1cba5f86"} Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.536332 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" event={"ID":"4e0e3c40-86af-4986-bf58-fa79ce187828","Type":"ContainerStarted","Data":"5f275f16796eb838b7768e9505dd01e92ce2776a3d08fa355285fb9c70fdc93f"} Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.546110 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hx85z" event={"ID":"cf463d95-25dd-4b99-afb0-dac99157c5fa","Type":"ContainerStarted","Data":"81e77b5bfd1fd87b6a34ee8c16757e47b1ff59fe040db1ca38dbbecf4c29c9ba"} Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.548232 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-6n9rn"] Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.551508 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.553499 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.555471 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.555484 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.556220 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6n9rn"] Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.580696 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-hx85z" podStartSLOduration=3.5806029329999998 podStartE2EDuration="3.580602933s" podCreationTimestamp="2026-02-27 00:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:25:46.569590713 +0000 UTC m=+1215.827130277" watchObservedRunningTime="2026-02-27 00:25:46.580602933 +0000 UTC m=+1215.838142497" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.686059 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xl8n7"] Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.693378 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-swiftconf\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.693475 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-ring-data-devices\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.693573 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-scripts\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.693613 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-combined-ca-bundle\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.693687 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6d9g\" (UniqueName: \"kubernetes.io/projected/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-kube-api-access-k6d9g\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.693815 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-dispersionconf\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.693837 
4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-etc-swift\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.706165 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xl8n7"] Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.795653 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-swiftconf\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.795714 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-ring-data-devices\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.795783 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-scripts\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.795808 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-combined-ca-bundle\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " 
pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.795849 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6d9g\" (UniqueName: \"kubernetes.io/projected/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-kube-api-access-k6d9g\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.795914 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-dispersionconf\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.796680 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-etc-swift\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.796738 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-ring-data-devices\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.796862 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-scripts\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 
00:25:46.797057 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-etc-swift\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.799684 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-combined-ca-bundle\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.800938 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-dispersionconf\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.801173 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-swiftconf\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.813865 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6d9g\" (UniqueName: \"kubernetes.io/projected/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-kube-api-access-k6d9g\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.854614 4781 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 27 00:25:46 crc 
kubenswrapper[4781]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/4e0e3c40-86af-4986-bf58-fa79ce187828/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 27 00:25:46 crc kubenswrapper[4781]: > podSandboxID="5f275f16796eb838b7768e9505dd01e92ce2776a3d08fa355285fb9c70fdc93f" Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.854841 4781 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 00:25:46 crc kubenswrapper[4781]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h5cbh7ch5d4h66fh676hdbh546h95h88h5ffh55ch7fhch57ch687hddhc7h5fdh57dh674h56fh64ch98h9bh557h55dh646h54ch54fh5c4h597q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/
hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j4cks,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86db49b7ff-4p8q8_openstack(4e0e3c40-86af-4986-bf58-fa79ce187828): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/4e0e3c40-86af-4986-bf58-fa79ce187828/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 27 00:25:46 crc kubenswrapper[4781]: > logger="UnhandledError" Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.856106 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with 
CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/4e0e3c40-86af-4986-bf58-fa79ce187828/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.868501 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.902143 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.902371 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.902406 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.902481 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift podName:fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11 nodeName:}" failed. No retries permitted until 2026-02-27 00:25:47.902459898 +0000 UTC m=+1217.159999452 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift") pod "swift-storage-0" (UID: "fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11") : configmap "swift-ring-files" not found Feb 27 00:25:47 crc kubenswrapper[4781]: I0227 00:25:47.352869 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f2c76ec-cfab-4f18-b624-722021700885" path="/var/lib/kubelet/pods/0f2c76ec-cfab-4f18-b624-722021700885/volumes" Feb 27 00:25:47 crc kubenswrapper[4781]: I0227 00:25:47.355329 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c3386c0-cab4-47ef-b395-af90773f2796" path="/var/lib/kubelet/pods/9c3386c0-cab4-47ef-b395-af90773f2796/volumes" Feb 27 00:25:47 crc kubenswrapper[4781]: I0227 00:25:47.388282 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6n9rn"] Feb 27 00:25:47 crc kubenswrapper[4781]: E0227 00:25:47.508622 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e37b0a7_69ac_439e_9c5a_207210fe40c8.slice/crio-conmon-31cd21a634eff04c79df7b5ee8d37fc4cdb1a4b5a72c57fc0d9aca1961c28780.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e37b0a7_69ac_439e_9c5a_207210fe40c8.slice/crio-31cd21a634eff04c79df7b5ee8d37fc4cdb1a4b5a72c57fc0d9aca1961c28780.scope\": RecentStats: unable to find data in memory cache]" Feb 27 00:25:47 crc kubenswrapper[4781]: I0227 00:25:47.566560 4781 generic.go:334] "Generic (PLEG): container finished" podID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerID="31cd21a634eff04c79df7b5ee8d37fc4cdb1a4b5a72c57fc0d9aca1961c28780" exitCode=0 Feb 27 00:25:47 crc kubenswrapper[4781]: I0227 00:25:47.566659 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gc44h" 
event={"ID":"8e37b0a7-69ac-439e-9c5a-207210fe40c8","Type":"ContainerDied","Data":"31cd21a634eff04c79df7b5ee8d37fc4cdb1a4b5a72c57fc0d9aca1961c28780"} Feb 27 00:25:47 crc kubenswrapper[4781]: I0227 00:25:47.568698 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6n9rn" event={"ID":"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b","Type":"ContainerStarted","Data":"ca26accad7ac480d16da11e818bc3769c592f1c77082ff70a7fdd81af22f0086"} Feb 27 00:25:47 crc kubenswrapper[4781]: I0227 00:25:47.935035 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:47 crc kubenswrapper[4781]: E0227 00:25:47.935252 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 00:25:47 crc kubenswrapper[4781]: E0227 00:25:47.935459 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 00:25:47 crc kubenswrapper[4781]: E0227 00:25:47.935518 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift podName:fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11 nodeName:}" failed. No retries permitted until 2026-02-27 00:25:49.935501207 +0000 UTC m=+1219.193040761 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift") pod "swift-storage-0" (UID: "fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11") : configmap "swift-ring-files" not found Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.580152 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" event={"ID":"877c39ec-0202-4987-b6e7-4fb90c4dc9b5","Type":"ContainerStarted","Data":"d0f2b15fcdbb4ac98c885abfd6fcaf06d850f08cfcf87c7b00d69c42cb65e362"} Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.581023 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.585932 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" event={"ID":"4e0e3c40-86af-4986-bf58-fa79ce187828","Type":"ContainerStarted","Data":"5e167781defb17ca0ec0a66f7db9a5b9464f7410f615b9d338fc5bcbdaa6963b"} Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.586245 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.590636 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"42503ae1-b143-45c3-8789-e2d1f72cc335","Type":"ContainerStarted","Data":"26ec8ae768d4844b138da07d723a9517917ce9493f3ade7645757aaba568e403"} Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.590929 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.596566 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gc44h" 
event={"ID":"8e37b0a7-69ac-439e-9c5a-207210fe40c8","Type":"ContainerStarted","Data":"e2bf980506549d387ee967a300bd50ff50a9e4489a44bcc2c952a5e2c00137a5"} Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.597135 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.607340 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" podStartSLOduration=5.629067275 podStartE2EDuration="33.607329735s" podCreationTimestamp="2026-02-27 00:25:15 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.466931981 +0000 UTC m=+1188.724471545" lastFinishedPulling="2026-02-27 00:25:47.445194451 +0000 UTC m=+1216.702734005" observedRunningTime="2026-02-27 00:25:48.607055088 +0000 UTC m=+1217.864594682" watchObservedRunningTime="2026-02-27 00:25:48.607329735 +0000 UTC m=+1217.864869289" Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.623005 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.635486 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=-9223372003.219313 podStartE2EDuration="33.635463904s" podCreationTimestamp="2026-02-27 00:25:15 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.492249535 +0000 UTC m=+1188.749789089" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:25:48.625699947 +0000 UTC m=+1217.883239501" watchObservedRunningTime="2026-02-27 00:25:48.635463904 +0000 UTC m=+1217.893003458" Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.642670 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" podStartSLOduration=4.642642232 
podStartE2EDuration="4.642642232s" podCreationTimestamp="2026-02-27 00:25:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:25:48.642076407 +0000 UTC m=+1217.899615961" watchObservedRunningTime="2026-02-27 00:25:48.642642232 +0000 UTC m=+1217.900181786" Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.667194 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-gc44h" podStartSLOduration=4.667176936 podStartE2EDuration="4.667176936s" podCreationTimestamp="2026-02-27 00:25:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:25:48.663829328 +0000 UTC m=+1217.921368882" watchObservedRunningTime="2026-02-27 00:25:48.667176936 +0000 UTC m=+1217.924716490" Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.026222 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:50 crc kubenswrapper[4781]: E0227 00:25:50.026562 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 00:25:50 crc kubenswrapper[4781]: E0227 00:25:50.027197 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 00:25:50 crc kubenswrapper[4781]: E0227 00:25:50.027310 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift podName:fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11 nodeName:}" failed. 
No retries permitted until 2026-02-27 00:25:54.027279956 +0000 UTC m=+1223.284819510 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift") pod "swift-storage-0" (UID: "fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11") : configmap "swift-ring-files" not found Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.634539 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" event={"ID":"233250c8-3871-43ec-8c1d-47bd1d3133e1","Type":"ContainerStarted","Data":"ab5e0848a0d2d1d0af3adfd42b67a8503712e61416ac52d021b088ec1c10cde8"} Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.635252 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.640166 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"91997a3e-9e65-4eab-a0b9-8f9c639a8d05","Type":"ContainerStarted","Data":"59ed5bb57f5c002905a336da46ce8019d8424d181ddbd01fd683c6c25bea9d90"} Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.640405 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.646089 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" event={"ID":"d9e3acc2-cee4-4bfe-af04-3a64041fc327","Type":"ContainerStarted","Data":"2a2354aca49bd460af5e535cec347a181edf8596ec560070d98bfc1ab2bff2e0"} Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.646507 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.653176 4781 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" podStartSLOduration=-9223372001.20162 podStartE2EDuration="35.65315532s" podCreationTimestamp="2026-02-27 00:25:15 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.459858315 +0000 UTC m=+1188.717397869" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:25:50.650423908 +0000 UTC m=+1219.907963462" watchObservedRunningTime="2026-02-27 00:25:50.65315532 +0000 UTC m=+1219.910694874" Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.663817 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.675691 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" podStartSLOduration=-9223372001.179104 podStartE2EDuration="35.675672791s" podCreationTimestamp="2026-02-27 00:25:15 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.447545972 +0000 UTC m=+1188.705085526" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:25:50.672752624 +0000 UTC m=+1219.930292178" watchObservedRunningTime="2026-02-27 00:25:50.675672791 +0000 UTC m=+1219.933212365" Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.705029 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.707234587 podStartE2EDuration="46.705011471s" podCreationTimestamp="2026-02-27 00:25:04 +0000 UTC" firstStartedPulling="2026-02-27 00:25:18.862539911 +0000 UTC m=+1188.120079465" lastFinishedPulling="2026-02-27 00:25:49.860316795 +0000 UTC m=+1219.117856349" observedRunningTime="2026-02-27 00:25:50.691788264 +0000 UTC m=+1219.949327818" watchObservedRunningTime="2026-02-27 00:25:50.705011471 +0000 UTC 
m=+1219.962551025" Feb 27 00:25:51 crc kubenswrapper[4781]: I0227 00:25:51.068393 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 27 00:25:51 crc kubenswrapper[4781]: I0227 00:25:51.068448 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 27 00:25:51 crc kubenswrapper[4781]: I0227 00:25:51.186340 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 27 00:25:51 crc kubenswrapper[4781]: I0227 00:25:51.654793 4781 generic.go:334] "Generic (PLEG): container finished" podID="919ba171-1971-416c-99c1-5dfcacc10a28" containerID="96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7" exitCode=0 Feb 27 00:25:51 crc kubenswrapper[4781]: I0227 00:25:51.654854 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"919ba171-1971-416c-99c1-5dfcacc10a28","Type":"ContainerDied","Data":"96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7"} Feb 27 00:25:51 crc kubenswrapper[4781]: I0227 00:25:51.736343 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.245177 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.245591 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.325053 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.732872 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:52 crc 
kubenswrapper[4781]: I0227 00:25:52.860056 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8c9b-account-create-update-d29bm"] Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.861412 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.872155 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-jrxqx"] Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.873346 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jrxqx" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.877194 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.891601 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jrxqx"] Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.911170 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8c9b-account-create-update-d29bm"] Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.993996 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chshx\" (UniqueName: \"kubernetes.io/projected/f1713962-9458-45b2-9f28-61409b7ff581-kube-api-access-chshx\") pod \"glance-db-create-jrxqx\" (UID: \"f1713962-9458-45b2-9f28-61409b7ff581\") " pod="openstack/glance-db-create-jrxqx" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.994033 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb4687ec-812e-48bb-8d53-ed628f3cd013-operator-scripts\") pod \"glance-8c9b-account-create-update-d29bm\" (UID: \"bb4687ec-812e-48bb-8d53-ed628f3cd013\") " 
pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.994073 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1713962-9458-45b2-9f28-61409b7ff581-operator-scripts\") pod \"glance-db-create-jrxqx\" (UID: \"f1713962-9458-45b2-9f28-61409b7ff581\") " pod="openstack/glance-db-create-jrxqx" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.994370 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2c6b\" (UniqueName: \"kubernetes.io/projected/bb4687ec-812e-48bb-8d53-ed628f3cd013-kube-api-access-l2c6b\") pod \"glance-8c9b-account-create-update-d29bm\" (UID: \"bb4687ec-812e-48bb-8d53-ed628f3cd013\") " pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.096437 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2c6b\" (UniqueName: \"kubernetes.io/projected/bb4687ec-812e-48bb-8d53-ed628f3cd013-kube-api-access-l2c6b\") pod \"glance-8c9b-account-create-update-d29bm\" (UID: \"bb4687ec-812e-48bb-8d53-ed628f3cd013\") " pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.096961 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chshx\" (UniqueName: \"kubernetes.io/projected/f1713962-9458-45b2-9f28-61409b7ff581-kube-api-access-chshx\") pod \"glance-db-create-jrxqx\" (UID: \"f1713962-9458-45b2-9f28-61409b7ff581\") " pod="openstack/glance-db-create-jrxqx" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.096986 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb4687ec-812e-48bb-8d53-ed628f3cd013-operator-scripts\") pod 
\"glance-8c9b-account-create-update-d29bm\" (UID: \"bb4687ec-812e-48bb-8d53-ed628f3cd013\") " pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.097021 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1713962-9458-45b2-9f28-61409b7ff581-operator-scripts\") pod \"glance-db-create-jrxqx\" (UID: \"f1713962-9458-45b2-9f28-61409b7ff581\") " pod="openstack/glance-db-create-jrxqx" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.097778 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1713962-9458-45b2-9f28-61409b7ff581-operator-scripts\") pod \"glance-db-create-jrxqx\" (UID: \"f1713962-9458-45b2-9f28-61409b7ff581\") " pod="openstack/glance-db-create-jrxqx" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.097837 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb4687ec-812e-48bb-8d53-ed628f3cd013-operator-scripts\") pod \"glance-8c9b-account-create-update-d29bm\" (UID: \"bb4687ec-812e-48bb-8d53-ed628f3cd013\") " pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.129740 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chshx\" (UniqueName: \"kubernetes.io/projected/f1713962-9458-45b2-9f28-61409b7ff581-kube-api-access-chshx\") pod \"glance-db-create-jrxqx\" (UID: \"f1713962-9458-45b2-9f28-61409b7ff581\") " pod="openstack/glance-db-create-jrxqx" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.129930 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2c6b\" (UniqueName: \"kubernetes.io/projected/bb4687ec-812e-48bb-8d53-ed628f3cd013-kube-api-access-l2c6b\") pod 
\"glance-8c9b-account-create-update-d29bm\" (UID: \"bb4687ec-812e-48bb-8d53-ed628f3cd013\") " pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.201617 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.285849 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jrxqx" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.574131 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-f66vm"] Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.575272 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f66vm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.585487 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-f66vm"] Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.648426 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5255-account-create-update-k87hd"] Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.650043 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.652743 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.663214 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5255-account-create-update-k87hd"] Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.675079 4781 generic.go:334] "Generic (PLEG): container finished" podID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerID="592b25e10dba92f06ec6db612c25fdc12d9afc456496a972e547225b9ac93f91" exitCode=0 Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.675115 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c7ca2a9f-a42e-4d9b-89a7-f2590842f328","Type":"ContainerDied","Data":"592b25e10dba92f06ec6db612c25fdc12d9afc456496a972e547225b9ac93f91"} Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.706806 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c6016e5-2641-4b82-b164-121ae822f863-operator-scripts\") pod \"keystone-5255-account-create-update-k87hd\" (UID: \"2c6016e5-2641-4b82-b164-121ae822f863\") " pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.706860 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpskz\" (UniqueName: \"kubernetes.io/projected/0cec0cd3-abcd-484c-85b8-03a44888a9b7-kube-api-access-dpskz\") pod \"keystone-db-create-f66vm\" (UID: \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\") " pod="openstack/keystone-db-create-f66vm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.707935 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cec0cd3-abcd-484c-85b8-03a44888a9b7-operator-scripts\") pod \"keystone-db-create-f66vm\" (UID: \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\") " pod="openstack/keystone-db-create-f66vm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.708173 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfbcp\" (UniqueName: \"kubernetes.io/projected/2c6016e5-2641-4b82-b164-121ae822f863-kube-api-access-zfbcp\") pod \"keystone-5255-account-create-update-k87hd\" (UID: \"2c6016e5-2641-4b82-b164-121ae822f863\") " pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.810508 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cec0cd3-abcd-484c-85b8-03a44888a9b7-operator-scripts\") pod \"keystone-db-create-f66vm\" (UID: \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\") " pod="openstack/keystone-db-create-f66vm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.810735 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfbcp\" (UniqueName: \"kubernetes.io/projected/2c6016e5-2641-4b82-b164-121ae822f863-kube-api-access-zfbcp\") pod \"keystone-5255-account-create-update-k87hd\" (UID: \"2c6016e5-2641-4b82-b164-121ae822f863\") " pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.810816 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c6016e5-2641-4b82-b164-121ae822f863-operator-scripts\") pod \"keystone-5255-account-create-update-k87hd\" (UID: \"2c6016e5-2641-4b82-b164-121ae822f863\") " pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.810852 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpskz\" (UniqueName: \"kubernetes.io/projected/0cec0cd3-abcd-484c-85b8-03a44888a9b7-kube-api-access-dpskz\") pod \"keystone-db-create-f66vm\" (UID: \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\") " pod="openstack/keystone-db-create-f66vm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.811455 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cec0cd3-abcd-484c-85b8-03a44888a9b7-operator-scripts\") pod \"keystone-db-create-f66vm\" (UID: \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\") " pod="openstack/keystone-db-create-f66vm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.812310 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c6016e5-2641-4b82-b164-121ae822f863-operator-scripts\") pod \"keystone-5255-account-create-update-k87hd\" (UID: \"2c6016e5-2641-4b82-b164-121ae822f863\") " pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.827303 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfbcp\" (UniqueName: \"kubernetes.io/projected/2c6016e5-2641-4b82-b164-121ae822f863-kube-api-access-zfbcp\") pod \"keystone-5255-account-create-update-k87hd\" (UID: \"2c6016e5-2641-4b82-b164-121ae822f863\") " pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.830861 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpskz\" (UniqueName: \"kubernetes.io/projected/0cec0cd3-abcd-484c-85b8-03a44888a9b7-kube-api-access-dpskz\") pod \"keystone-db-create-f66vm\" (UID: \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\") " pod="openstack/keystone-db-create-f66vm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.900213 4781 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f66vm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.911171 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-94sd2"] Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.912707 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-94sd2" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.920464 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-94sd2"] Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.964591 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.974341 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-547a-account-create-update-tf2pb"] Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.992480 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.005817 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-547a-account-create-update-tf2pb"] Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.008396 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.017748 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90ad80e-9897-4e20-b9b0-6add43c84bd0-operator-scripts\") pod \"placement-db-create-94sd2\" (UID: \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\") " pod="openstack/placement-db-create-94sd2" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.018148 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j978s\" (UniqueName: \"kubernetes.io/projected/c90ad80e-9897-4e20-b9b0-6add43c84bd0-kube-api-access-j978s\") pod \"placement-db-create-94sd2\" (UID: \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\") " pod="openstack/placement-db-create-94sd2" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.018206 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlll5\" (UniqueName: \"kubernetes.io/projected/6bdd8664-6d91-4616-8095-f44067fdca51-kube-api-access-zlll5\") pod \"placement-547a-account-create-update-tf2pb\" (UID: \"6bdd8664-6d91-4616-8095-f44067fdca51\") " pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.018312 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdd8664-6d91-4616-8095-f44067fdca51-operator-scripts\") pod 
\"placement-547a-account-create-update-tf2pb\" (UID: \"6bdd8664-6d91-4616-8095-f44067fdca51\") " pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.119661 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:54 crc kubenswrapper[4781]: E0227 00:25:54.119879 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 00:25:54 crc kubenswrapper[4781]: E0227 00:25:54.120075 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 00:25:54 crc kubenswrapper[4781]: E0227 00:25:54.120129 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift podName:fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11 nodeName:}" failed. No retries permitted until 2026-02-27 00:26:02.120110236 +0000 UTC m=+1231.377649790 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift") pod "swift-storage-0" (UID: "fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11") : configmap "swift-ring-files" not found Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.120044 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j978s\" (UniqueName: \"kubernetes.io/projected/c90ad80e-9897-4e20-b9b0-6add43c84bd0-kube-api-access-j978s\") pod \"placement-db-create-94sd2\" (UID: \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\") " pod="openstack/placement-db-create-94sd2" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.120224 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlll5\" (UniqueName: \"kubernetes.io/projected/6bdd8664-6d91-4616-8095-f44067fdca51-kube-api-access-zlll5\") pod \"placement-547a-account-create-update-tf2pb\" (UID: \"6bdd8664-6d91-4616-8095-f44067fdca51\") " pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.120352 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdd8664-6d91-4616-8095-f44067fdca51-operator-scripts\") pod \"placement-547a-account-create-update-tf2pb\" (UID: \"6bdd8664-6d91-4616-8095-f44067fdca51\") " pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.120415 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90ad80e-9897-4e20-b9b0-6add43c84bd0-operator-scripts\") pod \"placement-db-create-94sd2\" (UID: \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\") " pod="openstack/placement-db-create-94sd2" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.121066 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdd8664-6d91-4616-8095-f44067fdca51-operator-scripts\") pod \"placement-547a-account-create-update-tf2pb\" (UID: \"6bdd8664-6d91-4616-8095-f44067fdca51\") " pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.121129 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90ad80e-9897-4e20-b9b0-6add43c84bd0-operator-scripts\") pod \"placement-db-create-94sd2\" (UID: \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\") " pod="openstack/placement-db-create-94sd2" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.136423 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j978s\" (UniqueName: \"kubernetes.io/projected/c90ad80e-9897-4e20-b9b0-6add43c84bd0-kube-api-access-j978s\") pod \"placement-db-create-94sd2\" (UID: \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\") " pod="openstack/placement-db-create-94sd2" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.137655 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlll5\" (UniqueName: \"kubernetes.io/projected/6bdd8664-6d91-4616-8095-f44067fdca51-kube-api-access-zlll5\") pod \"placement-547a-account-create-update-tf2pb\" (UID: \"6bdd8664-6d91-4616-8095-f44067fdca51\") " pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.231015 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-94sd2" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.319679 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.732773 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:55 crc kubenswrapper[4781]: I0227 00:25:55.484810 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:55 crc kubenswrapper[4781]: I0227 00:25:55.556762 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4p8q8"] Feb 27 00:25:55 crc kubenswrapper[4781]: I0227 00:25:55.693099 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb" event={"ID":"092921e0-a033-4021-b0f5-9c89de3aa830","Type":"ContainerStarted","Data":"c6cb33670dd8b2bfdec6a38612fc1c0b4ed5fb2f72c14df8421862c9e5769e03"} Feb 27 00:25:55 crc kubenswrapper[4781]: I0227 00:25:55.693278 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerName="dnsmasq-dns" containerID="cri-o://5e167781defb17ca0ec0a66f7db9a5b9464f7410f615b9d338fc5bcbdaa6963b" gracePeriod=10 Feb 27 00:25:55 crc kubenswrapper[4781]: I0227 00:25:55.693698 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:55 crc kubenswrapper[4781]: I0227 00:25:55.828145 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:55 crc kubenswrapper[4781]: I0227 00:25:55.843046 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9zkpb" podStartSLOduration=17.842277554 podStartE2EDuration="48.843023726s" podCreationTimestamp="2026-02-27 00:25:07 +0000 UTC" firstStartedPulling="2026-02-27 00:25:18.862978573 
+0000 UTC m=+1188.120518127" lastFinishedPulling="2026-02-27 00:25:49.863724745 +0000 UTC m=+1219.121264299" observedRunningTime="2026-02-27 00:25:55.719560696 +0000 UTC m=+1224.977100250" watchObservedRunningTime="2026-02-27 00:25:55.843023726 +0000 UTC m=+1225.100563300" Feb 27 00:25:55 crc kubenswrapper[4781]: I0227 00:25:55.955221 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:56 crc kubenswrapper[4781]: I0227 00:25:56.705548 4781 generic.go:334] "Generic (PLEG): container finished" podID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerID="5e167781defb17ca0ec0a66f7db9a5b9464f7410f615b9d338fc5bcbdaa6963b" exitCode=0 Feb 27 00:25:56 crc kubenswrapper[4781]: I0227 00:25:56.705642 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" event={"ID":"4e0e3c40-86af-4986-bf58-fa79ce187828","Type":"ContainerDied","Data":"5e167781defb17ca0ec0a66f7db9a5b9464f7410f615b9d338fc5bcbdaa6963b"} Feb 27 00:25:57 crc kubenswrapper[4781]: I0227 00:25:57.057391 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="2691e066-2f4c-4e7e-bcac-01933bd6cadb" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 27 00:25:57 crc kubenswrapper[4781]: I0227 00:25:57.106082 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.405538 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-mmj84"] Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.409077 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mmj84" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.412596 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.426963 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mmj84"] Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.545731 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjn7t\" (UniqueName: \"kubernetes.io/projected/c986902c-3a54-4300-a078-2e70d305e97e-kube-api-access-tjn7t\") pod \"root-account-create-update-mmj84\" (UID: \"c986902c-3a54-4300-a078-2e70d305e97e\") " pod="openstack/root-account-create-update-mmj84" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.545818 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c986902c-3a54-4300-a078-2e70d305e97e-operator-scripts\") pod \"root-account-create-update-mmj84\" (UID: \"c986902c-3a54-4300-a078-2e70d305e97e\") " pod="openstack/root-account-create-update-mmj84" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.648425 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjn7t\" (UniqueName: \"kubernetes.io/projected/c986902c-3a54-4300-a078-2e70d305e97e-kube-api-access-tjn7t\") pod \"root-account-create-update-mmj84\" (UID: \"c986902c-3a54-4300-a078-2e70d305e97e\") " pod="openstack/root-account-create-update-mmj84" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.648504 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c986902c-3a54-4300-a078-2e70d305e97e-operator-scripts\") pod \"root-account-create-update-mmj84\" (UID: 
\"c986902c-3a54-4300-a078-2e70d305e97e\") " pod="openstack/root-account-create-update-mmj84" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.649428 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c986902c-3a54-4300-a078-2e70d305e97e-operator-scripts\") pod \"root-account-create-update-mmj84\" (UID: \"c986902c-3a54-4300-a078-2e70d305e97e\") " pod="openstack/root-account-create-update-mmj84" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.672853 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjn7t\" (UniqueName: \"kubernetes.io/projected/c986902c-3a54-4300-a078-2e70d305e97e-kube-api-access-tjn7t\") pod \"root-account-create-update-mmj84\" (UID: \"c986902c-3a54-4300-a078-2e70d305e97e\") " pod="openstack/root-account-create-update-mmj84" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.731495 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.752401 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mmj84" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.136974 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535866-qpv8l"] Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.138257 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535866-qpv8l" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.140970 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.141507 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.141716 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.154240 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535866-qpv8l"] Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.258425 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z28p\" (UniqueName: \"kubernetes.io/projected/1fe4edac-acb6-4906-9b3b-42b7c7a98943-kube-api-access-7z28p\") pod \"auto-csr-approver-29535866-qpv8l\" (UID: \"1fe4edac-acb6-4906-9b3b-42b7c7a98943\") " pod="openshift-infra/auto-csr-approver-29535866-qpv8l" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.361540 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z28p\" (UniqueName: \"kubernetes.io/projected/1fe4edac-acb6-4906-9b3b-42b7c7a98943-kube-api-access-7z28p\") pod \"auto-csr-approver-29535866-qpv8l\" (UID: \"1fe4edac-acb6-4906-9b3b-42b7c7a98943\") " pod="openshift-infra/auto-csr-approver-29535866-qpv8l" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.378554 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z28p\" (UniqueName: \"kubernetes.io/projected/1fe4edac-acb6-4906-9b3b-42b7c7a98943-kube-api-access-7z28p\") pod \"auto-csr-approver-29535866-qpv8l\" (UID: \"1fe4edac-acb6-4906-9b3b-42b7c7a98943\") " 
pod="openshift-infra/auto-csr-approver-29535866-qpv8l" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.466265 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535866-qpv8l" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.956231 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jrxqx"] Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.011297 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.080795 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-config\") pod \"4e0e3c40-86af-4986-bf58-fa79ce187828\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.080826 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-dns-svc\") pod \"4e0e3c40-86af-4986-bf58-fa79ce187828\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.080892 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-nb\") pod \"4e0e3c40-86af-4986-bf58-fa79ce187828\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.080968 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-sb\") pod \"4e0e3c40-86af-4986-bf58-fa79ce187828\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " Feb 27 00:26:01 
crc kubenswrapper[4781]: I0227 00:26:01.081040 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4cks\" (UniqueName: \"kubernetes.io/projected/4e0e3c40-86af-4986-bf58-fa79ce187828-kube-api-access-j4cks\") pod \"4e0e3c40-86af-4986-bf58-fa79ce187828\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.145898 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e0e3c40-86af-4986-bf58-fa79ce187828-kube-api-access-j4cks" (OuterVolumeSpecName: "kube-api-access-j4cks") pod "4e0e3c40-86af-4986-bf58-fa79ce187828" (UID: "4e0e3c40-86af-4986-bf58-fa79ce187828"). InnerVolumeSpecName "kube-api-access-j4cks". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.183725 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4cks\" (UniqueName: \"kubernetes.io/projected/4e0e3c40-86af-4986-bf58-fa79ce187828-kube-api-access-j4cks\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.278767 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8c9b-account-create-update-d29bm"] Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.428905 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e0e3c40-86af-4986-bf58-fa79ce187828" (UID: "4e0e3c40-86af-4986-bf58-fa79ce187828"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.434751 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-547a-account-create-update-tf2pb"] Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.473170 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e0e3c40-86af-4986-bf58-fa79ce187828" (UID: "4e0e3c40-86af-4986-bf58-fa79ce187828"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.483973 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-config" (OuterVolumeSpecName: "config") pod "4e0e3c40-86af-4986-bf58-fa79ce187828" (UID: "4e0e3c40-86af-4986-bf58-fa79ce187828"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.489728 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.489757 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.489771 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.515743 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e0e3c40-86af-4986-bf58-fa79ce187828" (UID: "4e0e3c40-86af-4986-bf58-fa79ce187828"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.591677 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.755051 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-547a-account-create-update-tf2pb" event={"ID":"6bdd8664-6d91-4616-8095-f44067fdca51","Type":"ContainerStarted","Data":"36e84b5ed003c240081e5afc68dd15e5fc943e783f85671eaa2c34c2afee47fd"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.757354 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"919ba171-1971-416c-99c1-5dfcacc10a28","Type":"ContainerStarted","Data":"dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.757686 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.761271 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c7ca2a9f-a42e-4d9b-89a7-f2590842f328","Type":"ContainerStarted","Data":"84e4c6c19d757fd81ef5f856104b51d9057ffe90f91b0313f39e58f7d670a984"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.762051 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.765657 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bd103c67-d035-4de1-aba9-667d1eb67813","Type":"ContainerStarted","Data":"90aac18f99b1b6de59145c17a7f71bca0b8c502500fa3e1f4423e423d4f545c8"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.775997 4781 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerStarted","Data":"490f54d4fc0654da6b5add2d9e470584271088a4fc9d0ff0972339bc97ab6f8f"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.785966 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6n9rn" event={"ID":"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b","Type":"ContainerStarted","Data":"7d3236f4301015aa89ce006050dc39e2b0704b179ed73e64d6106302850331f9"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.791135 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" event={"ID":"4e0e3c40-86af-4986-bf58-fa79ce187828","Type":"ContainerDied","Data":"5f275f16796eb838b7768e9505dd01e92ce2776a3d08fa355285fb9c70fdc93f"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.791179 4781 scope.go:117] "RemoveContainer" containerID="5e167781defb17ca0ec0a66f7db9a5b9464f7410f615b9d338fc5bcbdaa6963b" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.791278 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.792906 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-94sd2"] Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.803866 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535866-qpv8l"] Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.818144 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5255-account-create-update-k87hd"] Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.822545 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"58009056-4183-4017-bfa1-c14ce28b92ea","Type":"ContainerStarted","Data":"e218b31bb4894325c8b13e3e40f2780bfeb716eee793f6e19a4b09e747c5de90"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.824277 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=51.55299725 podStartE2EDuration="1m4.824259858s" podCreationTimestamp="2026-02-27 00:24:57 +0000 UTC" firstStartedPulling="2026-02-27 00:25:04.277396166 +0000 UTC m=+1173.534935720" lastFinishedPulling="2026-02-27 00:25:17.548658774 +0000 UTC m=+1186.806198328" observedRunningTime="2026-02-27 00:26:01.784448683 +0000 UTC m=+1231.041988237" watchObservedRunningTime="2026-02-27 00:26:01.824259858 +0000 UTC m=+1231.081799402" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.827151 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8c9b-account-create-update-d29bm" event={"ID":"bb4687ec-812e-48bb-8d53-ed628f3cd013","Type":"ContainerStarted","Data":"a52130091c9100982624c568d31dc83849096589647f47661d0debdea301a332"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.827204 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-8c9b-account-create-update-d29bm" event={"ID":"bb4687ec-812e-48bb-8d53-ed628f3cd013","Type":"ContainerStarted","Data":"1c0a5f73b00e8a9559b8d8378ce7177f49256b9d4979478bb067d80bc66e07fc"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.836775 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-f66vm"] Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.838810 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.058098145 podStartE2EDuration="54.83878956s" podCreationTimestamp="2026-02-27 00:25:07 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.000926433 +0000 UTC m=+1188.258465997" lastFinishedPulling="2026-02-27 00:26:00.781617858 +0000 UTC m=+1230.039157412" observedRunningTime="2026-02-27 00:26:01.831117918 +0000 UTC m=+1231.088657472" watchObservedRunningTime="2026-02-27 00:26:01.83878956 +0000 UTC m=+1231.096329114" Feb 27 00:26:01 crc kubenswrapper[4781]: W0227 00:26:01.841076 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fe4edac_acb6_4906_9b3b_42b7c7a98943.slice/crio-557e493fffe4db2dcddedbe073df198bb6b870988be78491ac11e021467b67f0 WatchSource:0}: Error finding container 557e493fffe4db2dcddedbe073df198bb6b870988be78491ac11e021467b67f0: Status 404 returned error can't find the container with id 557e493fffe4db2dcddedbe073df198bb6b870988be78491ac11e021467b67f0 Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.841252 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jrxqx" event={"ID":"f1713962-9458-45b2-9f28-61409b7ff581","Type":"ContainerStarted","Data":"b01d66bc253f93ef989863fc6fd69c5afb4405a98783d9e32be4f4b80ce3df36"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.841308 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jrxqx" 
event={"ID":"f1713962-9458-45b2-9f28-61409b7ff581","Type":"ContainerStarted","Data":"0c4e303b9cbf7a244c07b330df27d0efc55bd33315e704ad0931c0431621f3c4"} Feb 27 00:26:01 crc kubenswrapper[4781]: W0227 00:26:01.857786 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cec0cd3_abcd_484c_85b8_03a44888a9b7.slice/crio-8173fa3b8afd38768e66ef999c136681772389689078482ef577e5ddcd6b877a WatchSource:0}: Error finding container 8173fa3b8afd38768e66ef999c136681772389689078482ef577e5ddcd6b877a: Status 404 returned error can't find the container with id 8173fa3b8afd38768e66ef999c136681772389689078482ef577e5ddcd6b877a Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.887279 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mmj84"] Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.936640 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=63.936606716 podStartE2EDuration="1m3.936606716s" podCreationTimestamp="2026-02-27 00:24:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:01.865462379 +0000 UTC m=+1231.123001943" watchObservedRunningTime="2026-02-27 00:26:01.936606716 +0000 UTC m=+1231.194146270" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.952602 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-8c9b-account-create-update-d29bm" podStartSLOduration=9.952579915 podStartE2EDuration="9.952579915s" podCreationTimestamp="2026-02-27 00:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:01.885492525 +0000 UTC m=+1231.143032079" watchObservedRunningTime="2026-02-27 00:26:01.952579915 +0000 UTC 
m=+1231.210119469" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.968062 4781 scope.go:117] "RemoveContainer" containerID="aabc0546990906abca2b35d28ee124be17d9e0ff7483abdc7c60ccee1cba5f86" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.980383 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-jrxqx" podStartSLOduration=9.980362965 podStartE2EDuration="9.980362965s" podCreationTimestamp="2026-02-27 00:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:01.923460471 +0000 UTC m=+1231.181000045" watchObservedRunningTime="2026-02-27 00:26:01.980362965 +0000 UTC m=+1231.237902519" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.989410 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-6n9rn" podStartSLOduration=2.7783356230000003 podStartE2EDuration="15.989392351s" podCreationTimestamp="2026-02-27 00:25:46 +0000 UTC" firstStartedPulling="2026-02-27 00:25:47.439319026 +0000 UTC m=+1216.696858590" lastFinishedPulling="2026-02-27 00:26:00.650375754 +0000 UTC m=+1229.907915318" observedRunningTime="2026-02-27 00:26:01.945269584 +0000 UTC m=+1231.202809138" watchObservedRunningTime="2026-02-27 00:26:01.989392351 +0000 UTC m=+1231.246931905" Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.002915 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4p8q8"] Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.014597 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4p8q8"] Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.217069 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: 
\"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:26:02 crc kubenswrapper[4781]: E0227 00:26:02.217270 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 00:26:02 crc kubenswrapper[4781]: E0227 00:26:02.217283 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 00:26:02 crc kubenswrapper[4781]: E0227 00:26:02.217330 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift podName:fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11 nodeName:}" failed. No retries permitted until 2026-02-27 00:26:18.217315802 +0000 UTC m=+1247.474855356 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift") pod "swift-storage-0" (UID: "fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11") : configmap "swift-ring-files" not found Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.676251 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.865862 4781 generic.go:334] "Generic (PLEG): container finished" podID="f1713962-9458-45b2-9f28-61409b7ff581" containerID="b01d66bc253f93ef989863fc6fd69c5afb4405a98783d9e32be4f4b80ce3df36" exitCode=0 Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.865946 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jrxqx" event={"ID":"f1713962-9458-45b2-9f28-61409b7ff581","Type":"ContainerDied","Data":"b01d66bc253f93ef989863fc6fd69c5afb4405a98783d9e32be4f4b80ce3df36"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.908553 4781 generic.go:334] "Generic (PLEG): container finished" podID="bb4687ec-812e-48bb-8d53-ed628f3cd013" 
containerID="a52130091c9100982624c568d31dc83849096589647f47661d0debdea301a332" exitCode=0 Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.908688 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8c9b-account-create-update-d29bm" event={"ID":"bb4687ec-812e-48bb-8d53-ed628f3cd013","Type":"ContainerDied","Data":"a52130091c9100982624c568d31dc83849096589647f47661d0debdea301a332"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.916005 4781 generic.go:334] "Generic (PLEG): container finished" podID="2c6016e5-2641-4b82-b164-121ae822f863" containerID="b5253e8bb3200baca59ed8e598dc74eaddbc9fc4ea687d121523ff8347b4d62e" exitCode=0 Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.916122 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5255-account-create-update-k87hd" event={"ID":"2c6016e5-2641-4b82-b164-121ae822f863","Type":"ContainerDied","Data":"b5253e8bb3200baca59ed8e598dc74eaddbc9fc4ea687d121523ff8347b4d62e"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.916153 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5255-account-create-update-k87hd" event={"ID":"2c6016e5-2641-4b82-b164-121ae822f863","Type":"ContainerStarted","Data":"c5015fd63243058303378753db482675b9cd87268a9cd806c0e30c22950038da"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.918094 4781 generic.go:334] "Generic (PLEG): container finished" podID="c90ad80e-9897-4e20-b9b0-6add43c84bd0" containerID="4d55d2c6e343b6a1d3b8b47dac42837612db67ccce352ab276d326d2b146954e" exitCode=0 Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.918150 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-94sd2" event={"ID":"c90ad80e-9897-4e20-b9b0-6add43c84bd0","Type":"ContainerDied","Data":"4d55d2c6e343b6a1d3b8b47dac42837612db67ccce352ab276d326d2b146954e"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.918170 4781 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/placement-db-create-94sd2" event={"ID":"c90ad80e-9897-4e20-b9b0-6add43c84bd0","Type":"ContainerStarted","Data":"5b5249fbcfef480f4709ba4ec70ed903f8b332b1831523d320ff5af46111ef22"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.920505 4781 generic.go:334] "Generic (PLEG): container finished" podID="c986902c-3a54-4300-a078-2e70d305e97e" containerID="3d01f4c64b31dda5359f791eed0af9accdc107437765895fcc3cd585df0f55ae" exitCode=0 Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.920564 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mmj84" event={"ID":"c986902c-3a54-4300-a078-2e70d305e97e","Type":"ContainerDied","Data":"3d01f4c64b31dda5359f791eed0af9accdc107437765895fcc3cd585df0f55ae"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.920689 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mmj84" event={"ID":"c986902c-3a54-4300-a078-2e70d305e97e","Type":"ContainerStarted","Data":"6b8b1c057f209e42dffb4f53945e78631d479cd15aecac6c6f86368ea8a7b90e"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.923985 4781 generic.go:334] "Generic (PLEG): container finished" podID="6bdd8664-6d91-4616-8095-f44067fdca51" containerID="8fb72d9409a124bb8fa0479e75bf3cf0cd120b3aae8696f10bef9465f2261fc6" exitCode=0 Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.924052 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-547a-account-create-update-tf2pb" event={"ID":"6bdd8664-6d91-4616-8095-f44067fdca51","Type":"ContainerDied","Data":"8fb72d9409a124bb8fa0479e75bf3cf0cd120b3aae8696f10bef9465f2261fc6"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.925567 4781 generic.go:334] "Generic (PLEG): container finished" podID="0cec0cd3-abcd-484c-85b8-03a44888a9b7" containerID="74853e0dfa3329c0157368e93fb3d1251b7149a8041ea7981936c9bd91076b44" exitCode=0 Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.925659 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f66vm" event={"ID":"0cec0cd3-abcd-484c-85b8-03a44888a9b7","Type":"ContainerDied","Data":"74853e0dfa3329c0157368e93fb3d1251b7149a8041ea7981936c9bd91076b44"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.925688 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f66vm" event={"ID":"0cec0cd3-abcd-484c-85b8-03a44888a9b7","Type":"ContainerStarted","Data":"8173fa3b8afd38768e66ef999c136681772389689078482ef577e5ddcd6b877a"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.941077 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535866-qpv8l" event={"ID":"1fe4edac-acb6-4906-9b3b-42b7c7a98943","Type":"ContainerStarted","Data":"557e493fffe4db2dcddedbe073df198bb6b870988be78491ac11e021467b67f0"} Feb 27 00:26:03 crc kubenswrapper[4781]: I0227 00:26:03.322066 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" path="/var/lib/kubelet/pods/4e0e3c40-86af-4986-bf58-fa79ce187828/volumes" Feb 27 00:26:03 crc kubenswrapper[4781]: I0227 00:26:03.675956 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 27 00:26:03 crc kubenswrapper[4781]: I0227 00:26:03.944155 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerStarted","Data":"c6e860c6c62b63e5a5fe835a4877c45040a36e7fc332cce5af395a3eaa5e24b1"} Feb 27 00:26:03 crc kubenswrapper[4781]: I0227 00:26:03.949216 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535866-qpv8l" event={"ID":"1fe4edac-acb6-4906-9b3b-42b7c7a98943","Type":"ContainerStarted","Data":"28555d58f1fd114e239212917d6df64a83d89ed63bf1f65157974daf4ae101b8"} Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.318092 4781 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f66vm" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.367081 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpskz\" (UniqueName: \"kubernetes.io/projected/0cec0cd3-abcd-484c-85b8-03a44888a9b7-kube-api-access-dpskz\") pod \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\" (UID: \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.367216 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cec0cd3-abcd-484c-85b8-03a44888a9b7-operator-scripts\") pod \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\" (UID: \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.370478 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cec0cd3-abcd-484c-85b8-03a44888a9b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0cec0cd3-abcd-484c-85b8-03a44888a9b7" (UID: "0cec0cd3-abcd-484c-85b8-03a44888a9b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.400849 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cec0cd3-abcd-484c-85b8-03a44888a9b7-kube-api-access-dpskz" (OuterVolumeSpecName: "kube-api-access-dpskz") pod "0cec0cd3-abcd-484c-85b8-03a44888a9b7" (UID: "0cec0cd3-abcd-484c-85b8-03a44888a9b7"). InnerVolumeSpecName "kube-api-access-dpskz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.471930 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpskz\" (UniqueName: \"kubernetes.io/projected/0cec0cd3-abcd-484c-85b8-03a44888a9b7-kube-api-access-dpskz\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.471964 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cec0cd3-abcd-484c-85b8-03a44888a9b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.597685 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-94sd2" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.612227 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jrxqx" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.675195 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1713962-9458-45b2-9f28-61409b7ff581-operator-scripts\") pod \"f1713962-9458-45b2-9f28-61409b7ff581\" (UID: \"f1713962-9458-45b2-9f28-61409b7ff581\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.675287 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chshx\" (UniqueName: \"kubernetes.io/projected/f1713962-9458-45b2-9f28-61409b7ff581-kube-api-access-chshx\") pod \"f1713962-9458-45b2-9f28-61409b7ff581\" (UID: \"f1713962-9458-45b2-9f28-61409b7ff581\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.675370 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90ad80e-9897-4e20-b9b0-6add43c84bd0-operator-scripts\") pod 
\"c90ad80e-9897-4e20-b9b0-6add43c84bd0\" (UID: \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.675423 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j978s\" (UniqueName: \"kubernetes.io/projected/c90ad80e-9897-4e20-b9b0-6add43c84bd0-kube-api-access-j978s\") pod \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\" (UID: \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.675659 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1713962-9458-45b2-9f28-61409b7ff581-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1713962-9458-45b2-9f28-61409b7ff581" (UID: "f1713962-9458-45b2-9f28-61409b7ff581"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.676114 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1713962-9458-45b2-9f28-61409b7ff581-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.676742 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c90ad80e-9897-4e20-b9b0-6add43c84bd0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c90ad80e-9897-4e20-b9b0-6add43c84bd0" (UID: "c90ad80e-9897-4e20-b9b0-6add43c84bd0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.680827 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1713962-9458-45b2-9f28-61409b7ff581-kube-api-access-chshx" (OuterVolumeSpecName: "kube-api-access-chshx") pod "f1713962-9458-45b2-9f28-61409b7ff581" (UID: "f1713962-9458-45b2-9f28-61409b7ff581"). 
InnerVolumeSpecName "kube-api-access-chshx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.681038 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c90ad80e-9897-4e20-b9b0-6add43c84bd0-kube-api-access-j978s" (OuterVolumeSpecName: "kube-api-access-j978s") pod "c90ad80e-9897-4e20-b9b0-6add43c84bd0" (UID: "c90ad80e-9897-4e20-b9b0-6add43c84bd0"). InnerVolumeSpecName "kube-api-access-j978s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.764107 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.777945 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j978s\" (UniqueName: \"kubernetes.io/projected/c90ad80e-9897-4e20-b9b0-6add43c84bd0-kube-api-access-j978s\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.777998 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chshx\" (UniqueName: \"kubernetes.io/projected/f1713962-9458-45b2-9f28-61409b7ff581-kube-api-access-chshx\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.778012 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90ad80e-9897-4e20-b9b0-6add43c84bd0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.840577 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.879361 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c6016e5-2641-4b82-b164-121ae822f863-operator-scripts\") pod \"2c6016e5-2641-4b82-b164-121ae822f863\" (UID: \"2c6016e5-2641-4b82-b164-121ae822f863\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.879503 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfbcp\" (UniqueName: \"kubernetes.io/projected/2c6016e5-2641-4b82-b164-121ae822f863-kube-api-access-zfbcp\") pod \"2c6016e5-2641-4b82-b164-121ae822f863\" (UID: \"2c6016e5-2641-4b82-b164-121ae822f863\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.880051 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c6016e5-2641-4b82-b164-121ae822f863-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c6016e5-2641-4b82-b164-121ae822f863" (UID: "2c6016e5-2641-4b82-b164-121ae822f863"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.880500 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c6016e5-2641-4b82-b164-121ae822f863-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.886378 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c6016e5-2641-4b82-b164-121ae822f863-kube-api-access-zfbcp" (OuterVolumeSpecName: "kube-api-access-zfbcp") pod "2c6016e5-2641-4b82-b164-121ae822f863" (UID: "2c6016e5-2641-4b82-b164-121ae822f863"). InnerVolumeSpecName "kube-api-access-zfbcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.886822 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.958662 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jrxqx" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.958900 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jrxqx" event={"ID":"f1713962-9458-45b2-9f28-61409b7ff581","Type":"ContainerDied","Data":"0c4e303b9cbf7a244c07b330df27d0efc55bd33315e704ad0931c0431621f3c4"} Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.958935 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c4e303b9cbf7a244c07b330df27d0efc55bd33315e704ad0931c0431621f3c4" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.960611 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-547a-account-create-update-tf2pb" event={"ID":"6bdd8664-6d91-4616-8095-f44067fdca51","Type":"ContainerDied","Data":"36e84b5ed003c240081e5afc68dd15e5fc943e783f85671eaa2c34c2afee47fd"} Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.960734 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36e84b5ed003c240081e5afc68dd15e5fc943e783f85671eaa2c34c2afee47fd" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.960794 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.963030 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f66vm" event={"ID":"0cec0cd3-abcd-484c-85b8-03a44888a9b7","Type":"ContainerDied","Data":"8173fa3b8afd38768e66ef999c136681772389689078482ef577e5ddcd6b877a"} Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.963062 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8173fa3b8afd38768e66ef999c136681772389689078482ef577e5ddcd6b877a" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.963112 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f66vm" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.966565 4781 generic.go:334] "Generic (PLEG): container finished" podID="1fe4edac-acb6-4906-9b3b-42b7c7a98943" containerID="28555d58f1fd114e239212917d6df64a83d89ed63bf1f65157974daf4ae101b8" exitCode=0 Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.966638 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535866-qpv8l" event={"ID":"1fe4edac-acb6-4906-9b3b-42b7c7a98943","Type":"ContainerDied","Data":"28555d58f1fd114e239212917d6df64a83d89ed63bf1f65157974daf4ae101b8"} Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.968508 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.968519 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5255-account-create-update-k87hd" event={"ID":"2c6016e5-2641-4b82-b164-121ae822f863","Type":"ContainerDied","Data":"c5015fd63243058303378753db482675b9cd87268a9cd806c0e30c22950038da"} Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.968553 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5015fd63243058303378753db482675b9cd87268a9cd806c0e30c22950038da" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.970620 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"58009056-4183-4017-bfa1-c14ce28b92ea","Type":"ContainerStarted","Data":"08ef63f36bfd20f4318ef29e1e8d3879e1a5dd2fa86a7d1fbfe5e75b632a8837"} Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.970800 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.972039 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-94sd2" event={"ID":"c90ad80e-9897-4e20-b9b0-6add43c84bd0","Type":"ContainerDied","Data":"5b5249fbcfef480f4709ba4ec70ed903f8b332b1831523d320ff5af46111ef22"} Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.972070 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b5249fbcfef480f4709ba4ec70ed903f8b332b1831523d320ff5af46111ef22" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.972076 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-94sd2" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.984467 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdd8664-6d91-4616-8095-f44067fdca51-operator-scripts\") pod \"6bdd8664-6d91-4616-8095-f44067fdca51\" (UID: \"6bdd8664-6d91-4616-8095-f44067fdca51\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.984874 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bdd8664-6d91-4616-8095-f44067fdca51-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bdd8664-6d91-4616-8095-f44067fdca51" (UID: "6bdd8664-6d91-4616-8095-f44067fdca51"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.984613 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlll5\" (UniqueName: \"kubernetes.io/projected/6bdd8664-6d91-4616-8095-f44067fdca51-kube-api-access-zlll5\") pod \"6bdd8664-6d91-4616-8095-f44067fdca51\" (UID: \"6bdd8664-6d91-4616-8095-f44067fdca51\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.988143 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfbcp\" (UniqueName: \"kubernetes.io/projected/2c6016e5-2641-4b82-b164-121ae822f863-kube-api-access-zfbcp\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.988174 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdd8664-6d91-4616-8095-f44067fdca51-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.988559 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6bdd8664-6d91-4616-8095-f44067fdca51-kube-api-access-zlll5" (OuterVolumeSpecName: "kube-api-access-zlll5") pod "6bdd8664-6d91-4616-8095-f44067fdca51" (UID: "6bdd8664-6d91-4616-8095-f44067fdca51"). InnerVolumeSpecName "kube-api-access-zlll5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.989443 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.007395 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=17.995921644 podStartE2EDuration="1m0.007377485s" podCreationTimestamp="2026-02-27 00:25:05 +0000 UTC" firstStartedPulling="2026-02-27 00:25:18.868418986 +0000 UTC m=+1188.125958540" lastFinishedPulling="2026-02-27 00:26:00.879874827 +0000 UTC m=+1230.137414381" observedRunningTime="2026-02-27 00:26:04.994153906 +0000 UTC m=+1234.251693460" watchObservedRunningTime="2026-02-27 00:26:05.007377485 +0000 UTC m=+1234.264917039" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.015584 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.026349 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mmj84" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.088932 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjn7t\" (UniqueName: \"kubernetes.io/projected/c986902c-3a54-4300-a078-2e70d305e97e-kube-api-access-tjn7t\") pod \"c986902c-3a54-4300-a078-2e70d305e97e\" (UID: \"c986902c-3a54-4300-a078-2e70d305e97e\") " Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.089055 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb4687ec-812e-48bb-8d53-ed628f3cd013-operator-scripts\") pod \"bb4687ec-812e-48bb-8d53-ed628f3cd013\" (UID: \"bb4687ec-812e-48bb-8d53-ed628f3cd013\") " Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.089188 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c986902c-3a54-4300-a078-2e70d305e97e-operator-scripts\") pod \"c986902c-3a54-4300-a078-2e70d305e97e\" (UID: \"c986902c-3a54-4300-a078-2e70d305e97e\") " Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.089301 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2c6b\" (UniqueName: \"kubernetes.io/projected/bb4687ec-812e-48bb-8d53-ed628f3cd013-kube-api-access-l2c6b\") pod \"bb4687ec-812e-48bb-8d53-ed628f3cd013\" (UID: \"bb4687ec-812e-48bb-8d53-ed628f3cd013\") " Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.089712 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb4687ec-812e-48bb-8d53-ed628f3cd013-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb4687ec-812e-48bb-8d53-ed628f3cd013" (UID: "bb4687ec-812e-48bb-8d53-ed628f3cd013"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.089776 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c986902c-3a54-4300-a078-2e70d305e97e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c986902c-3a54-4300-a078-2e70d305e97e" (UID: "c986902c-3a54-4300-a078-2e70d305e97e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.093041 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c986902c-3a54-4300-a078-2e70d305e97e-kube-api-access-tjn7t" (OuterVolumeSpecName: "kube-api-access-tjn7t") pod "c986902c-3a54-4300-a078-2e70d305e97e" (UID: "c986902c-3a54-4300-a078-2e70d305e97e"). InnerVolumeSpecName "kube-api-access-tjn7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.095142 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb4687ec-812e-48bb-8d53-ed628f3cd013-kube-api-access-l2c6b" (OuterVolumeSpecName: "kube-api-access-l2c6b") pod "bb4687ec-812e-48bb-8d53-ed628f3cd013" (UID: "bb4687ec-812e-48bb-8d53-ed628f3cd013"). InnerVolumeSpecName "kube-api-access-l2c6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.095721 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlll5\" (UniqueName: \"kubernetes.io/projected/6bdd8664-6d91-4616-8095-f44067fdca51-kube-api-access-zlll5\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.095883 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjn7t\" (UniqueName: \"kubernetes.io/projected/c986902c-3a54-4300-a078-2e70d305e97e-kube-api-access-tjn7t\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.096017 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb4687ec-812e-48bb-8d53-ed628f3cd013-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.096118 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c986902c-3a54-4300-a078-2e70d305e97e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.198904 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2c6b\" (UniqueName: \"kubernetes.io/projected/bb4687ec-812e-48bb-8d53-ed628f3cd013-kube-api-access-l2c6b\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.364742 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535866-qpv8l" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.503391 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z28p\" (UniqueName: \"kubernetes.io/projected/1fe4edac-acb6-4906-9b3b-42b7c7a98943-kube-api-access-7z28p\") pod \"1fe4edac-acb6-4906-9b3b-42b7c7a98943\" (UID: \"1fe4edac-acb6-4906-9b3b-42b7c7a98943\") " Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.529817 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe4edac-acb6-4906-9b3b-42b7c7a98943-kube-api-access-7z28p" (OuterVolumeSpecName: "kube-api-access-7z28p") pod "1fe4edac-acb6-4906-9b3b-42b7c7a98943" (UID: "1fe4edac-acb6-4906-9b3b-42b7c7a98943"). InnerVolumeSpecName "kube-api-access-7z28p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.606821 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z28p\" (UniqueName: \"kubernetes.io/projected/1fe4edac-acb6-4906-9b3b-42b7c7a98943-kube-api-access-7z28p\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.716738 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.979938 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mmj84" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.979919 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mmj84" event={"ID":"c986902c-3a54-4300-a078-2e70d305e97e","Type":"ContainerDied","Data":"6b8b1c057f209e42dffb4f53945e78631d479cd15aecac6c6f86368ea8a7b90e"} Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.980840 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b8b1c057f209e42dffb4f53945e78631d479cd15aecac6c6f86368ea8a7b90e" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.981715 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8c9b-account-create-update-d29bm" event={"ID":"bb4687ec-812e-48bb-8d53-ed628f3cd013","Type":"ContainerDied","Data":"1c0a5f73b00e8a9559b8d8378ce7177f49256b9d4979478bb067d80bc66e07fc"} Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.981748 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c0a5f73b00e8a9559b8d8378ce7177f49256b9d4979478bb067d80bc66e07fc" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.981767 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.983099 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535866-qpv8l" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.983200 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535866-qpv8l" event={"ID":"1fe4edac-acb6-4906-9b3b-42b7c7a98943","Type":"ContainerDied","Data":"557e493fffe4db2dcddedbe073df198bb6b870988be78491ac11e021467b67f0"} Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.983230 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="557e493fffe4db2dcddedbe073df198bb6b870988be78491ac11e021467b67f0" Feb 27 00:26:06 crc kubenswrapper[4781]: I0227 00:26:06.108359 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:26:06 crc kubenswrapper[4781]: I0227 00:26:06.489728 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535860-d6xsb"] Feb 27 00:26:06 crc kubenswrapper[4781]: I0227 00:26:06.500124 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535860-d6xsb"] Feb 27 00:26:06 crc kubenswrapper[4781]: I0227 00:26:06.952743 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="2691e066-2f4c-4e7e-bcac-01933bd6cadb" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 27 00:26:07 crc kubenswrapper[4781]: I0227 00:26:07.017371 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:26:07 crc kubenswrapper[4781]: I0227 00:26:07.319508 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8ba504f-040f-4632-b5d0-4b28aef8d27e" path="/var/lib/kubelet/pods/c8ba504f-040f-4632-b5d0-4b28aef8d27e/volumes" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.215386 4781 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-8tmft"] Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229419 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c6016e5-2641-4b82-b164-121ae822f863" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229465 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6016e5-2641-4b82-b164-121ae822f863" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229486 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c90ad80e-9897-4e20-b9b0-6add43c84bd0" containerName="mariadb-database-create" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229494 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90ad80e-9897-4e20-b9b0-6add43c84bd0" containerName="mariadb-database-create" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229508 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerName="init" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229518 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerName="init" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229532 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bdd8664-6d91-4616-8095-f44067fdca51" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229542 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bdd8664-6d91-4616-8095-f44067fdca51" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229553 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1713962-9458-45b2-9f28-61409b7ff581" containerName="mariadb-database-create" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229561 
4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1713962-9458-45b2-9f28-61409b7ff581" containerName="mariadb-database-create" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229585 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4687ec-812e-48bb-8d53-ed628f3cd013" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229592 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4687ec-812e-48bb-8d53-ed628f3cd013" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229607 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cec0cd3-abcd-484c-85b8-03a44888a9b7" containerName="mariadb-database-create" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229614 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cec0cd3-abcd-484c-85b8-03a44888a9b7" containerName="mariadb-database-create" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229644 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerName="dnsmasq-dns" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229652 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerName="dnsmasq-dns" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229663 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c986902c-3a54-4300-a078-2e70d305e97e" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229671 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c986902c-3a54-4300-a078-2e70d305e97e" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229680 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe4edac-acb6-4906-9b3b-42b7c7a98943" containerName="oc" Feb 27 00:26:08 
crc kubenswrapper[4781]: I0227 00:26:08.229687 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe4edac-acb6-4906-9b3b-42b7c7a98943" containerName="oc"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229951 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb4687ec-812e-48bb-8d53-ed628f3cd013" containerName="mariadb-account-create-update"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229969 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c986902c-3a54-4300-a078-2e70d305e97e" containerName="mariadb-account-create-update"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229983 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerName="dnsmasq-dns"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229998 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bdd8664-6d91-4616-8095-f44067fdca51" containerName="mariadb-account-create-update"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.230008 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe4edac-acb6-4906-9b3b-42b7c7a98943" containerName="oc"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.230019 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c90ad80e-9897-4e20-b9b0-6add43c84bd0" containerName="mariadb-database-create"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.230028 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1713962-9458-45b2-9f28-61409b7ff581" containerName="mariadb-database-create"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.230036 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cec0cd3-abcd-484c-85b8-03a44888a9b7" containerName="mariadb-database-create"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.230052 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c6016e5-2641-4b82-b164-121ae822f863" containerName="mariadb-account-create-update"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.230659 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8tmft"]
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.230755 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8tmft"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.237551 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4ql2s"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.237935 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.351880 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-config-data\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.351976 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsrkc\" (UniqueName: \"kubernetes.io/projected/47cc3f01-6a5c-4797-bf86-25770e66e928-kube-api-access-gsrkc\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.352015 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-db-sync-config-data\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.352107 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-combined-ca-bundle\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.454039 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsrkc\" (UniqueName: \"kubernetes.io/projected/47cc3f01-6a5c-4797-bf86-25770e66e928-kube-api-access-gsrkc\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.454114 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-db-sync-config-data\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.454911 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-combined-ca-bundle\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.454985 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-config-data\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.463431 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-config-data\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.466098 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-db-sync-config-data\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.470226 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-combined-ca-bundle\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.474254 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsrkc\" (UniqueName: \"kubernetes.io/projected/47cc3f01-6a5c-4797-bf86-25770e66e928-kube-api-access-gsrkc\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft"
Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.551878 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8tmft"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:08.738876 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:08.992152 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:08.998324 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.002825 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-f6c72"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.003137 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.003321 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.003503 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.039784 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.134538 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8tmft"]
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.170726 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.170777 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5923572-3637-49e3-9eea-72e52c5fb88b-scripts\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.170859 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.170889 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.171034 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp584\" (UniqueName: \"kubernetes.io/projected/d5923572-3637-49e3-9eea-72e52c5fb88b-kube-api-access-cp584\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.171185 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5923572-3637-49e3-9eea-72e52c5fb88b-config\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.171460 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5923572-3637-49e3-9eea-72e52c5fb88b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.272786 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.272834 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.272872 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp584\" (UniqueName: \"kubernetes.io/projected/d5923572-3637-49e3-9eea-72e52c5fb88b-kube-api-access-cp584\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.272894 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5923572-3637-49e3-9eea-72e52c5fb88b-config\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.272979 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5923572-3637-49e3-9eea-72e52c5fb88b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.273051 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.273080 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5923572-3637-49e3-9eea-72e52c5fb88b-scripts\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.274080 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5923572-3637-49e3-9eea-72e52c5fb88b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.274824 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5923572-3637-49e3-9eea-72e52c5fb88b-config\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.275366 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5923572-3637-49e3-9eea-72e52c5fb88b-scripts\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.281642 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.284925 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.284940 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.295414 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp584\" (UniqueName: \"kubernetes.io/projected/d5923572-3637-49e3-9eea-72e52c5fb88b-kube-api-access-cp584\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0"
Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.335014 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 27 00:26:10 crc kubenswrapper[4781]: I0227 00:26:10.026109 4781 generic.go:334] "Generic (PLEG): container finished" podID="b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" containerID="7d3236f4301015aa89ce006050dc39e2b0704b179ed73e64d6106302850331f9" exitCode=0
Feb 27 00:26:10 crc kubenswrapper[4781]: I0227 00:26:10.026189 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6n9rn" event={"ID":"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b","Type":"ContainerDied","Data":"7d3236f4301015aa89ce006050dc39e2b0704b179ed73e64d6106302850331f9"}
Feb 27 00:26:10 crc kubenswrapper[4781]: I0227 00:26:10.028271 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8tmft" event={"ID":"47cc3f01-6a5c-4797-bf86-25770e66e928","Type":"ContainerStarted","Data":"c43bdd484887a1ab19b1a74ff7e94493b840e9d2b41b9b9e8c3466f0b78cc88d"}
Feb 27 00:26:10 crc kubenswrapper[4781]: I0227 00:26:10.047719 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 27 00:26:10 crc kubenswrapper[4781]: W0227 00:26:10.049579 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5923572_3637_49e3_9eea_72e52c5fb88b.slice/crio-d7cef7c0a0810b567b27259e063c6dad9a86a7be2a05f6f35d2ea8ce4f01bcb4 WatchSource:0}: Error finding container d7cef7c0a0810b567b27259e063c6dad9a86a7be2a05f6f35d2ea8ce4f01bcb4: Status 404 returned error can't find the container with id d7cef7c0a0810b567b27259e063c6dad9a86a7be2a05f6f35d2ea8ce4f01bcb4
Feb 27 00:26:10 crc kubenswrapper[4781]: I0227 00:26:10.925030 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-mmj84"]
Feb 27 00:26:10 crc kubenswrapper[4781]: I0227 00:26:10.934765 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-mmj84"]
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.042547 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d5923572-3637-49e3-9eea-72e52c5fb88b","Type":"ContainerStarted","Data":"d7cef7c0a0810b567b27259e063c6dad9a86a7be2a05f6f35d2ea8ce4f01bcb4"}
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.331272 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c986902c-3a54-4300-a078-2e70d305e97e" path="/var/lib/kubelet/pods/c986902c-3a54-4300-a078-2e70d305e97e/volumes"
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.469963 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6n9rn"
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.620099 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6d9g\" (UniqueName: \"kubernetes.io/projected/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-kube-api-access-k6d9g\") pod \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") "
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.620171 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-etc-swift\") pod \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") "
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.620271 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-swiftconf\") pod \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") "
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.620305 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-ring-data-devices\") pod \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") "
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.620325 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-scripts\") pod \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") "
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.620410 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-dispersionconf\") pod \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") "
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.620429 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-combined-ca-bundle\") pod \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") "
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.621350 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" (UID: "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.621575 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" (UID: "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.625847 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-kube-api-access-k6d9g" (OuterVolumeSpecName: "kube-api-access-k6d9g") pod "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" (UID: "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b"). InnerVolumeSpecName "kube-api-access-k6d9g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.645018 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" (UID: "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.646897 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" (UID: "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.647693 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" (UID: "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.659185 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-scripts" (OuterVolumeSpecName: "scripts") pod "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" (UID: "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.722891 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.722921 4781 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-dispersionconf\") on node \"crc\" DevicePath \"\""
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.722936 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.722950 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6d9g\" (UniqueName: \"kubernetes.io/projected/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-kube-api-access-k6d9g\") on node \"crc\" DevicePath \"\""
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.722962 4781 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.722973 4781 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-swiftconf\") on node \"crc\" DevicePath \"\""
Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.722982 4781 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-ring-data-devices\") on node \"crc\" DevicePath \"\""
Feb 27 00:26:12 crc kubenswrapper[4781]: I0227 00:26:12.092186 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d5923572-3637-49e3-9eea-72e52c5fb88b","Type":"ContainerStarted","Data":"27c60acd2a3e598cd4eb2f0aca7bc6776567328b53a7fd4e5925a5387dd11dae"}
Feb 27 00:26:12 crc kubenswrapper[4781]: I0227 00:26:12.094867 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6n9rn" event={"ID":"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b","Type":"ContainerDied","Data":"ca26accad7ac480d16da11e818bc3769c592f1c77082ff70a7fdd81af22f0086"}
Feb 27 00:26:12 crc kubenswrapper[4781]: I0227 00:26:12.094900 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca26accad7ac480d16da11e818bc3769c592f1c77082ff70a7fdd81af22f0086"
Feb 27 00:26:12 crc kubenswrapper[4781]: I0227 00:26:12.094983 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6n9rn"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.106607 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d5923572-3637-49e3-9eea-72e52c5fb88b","Type":"ContainerStarted","Data":"5e9e97c2d3c5129072f1159d516f11fdd77eb7aa658b8b7af077d3e387dfebbf"}
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.106964 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.131339 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.386301897 podStartE2EDuration="5.131317099s" podCreationTimestamp="2026-02-27 00:26:08 +0000 UTC" firstStartedPulling="2026-02-27 00:26:10.051988296 +0000 UTC m=+1239.309527840" lastFinishedPulling="2026-02-27 00:26:11.797003488 +0000 UTC m=+1241.054543042" observedRunningTime="2026-02-27 00:26:13.121978302 +0000 UTC m=+1242.379517866" watchObservedRunningTime="2026-02-27 00:26:13.131317099 +0000 UTC m=+1242.388856663"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.333425 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hcb9s"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.339096 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hcb9s"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.578800 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9zkpb-config-9k74f"]
Feb 27 00:26:13 crc kubenswrapper[4781]: E0227 00:26:13.579577 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" containerName="swift-ring-rebalance"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.579676 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" containerName="swift-ring-rebalance"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.579950 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" containerName="swift-ring-rebalance"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.580779 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.587083 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.592786 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9zkpb-config-9k74f"]
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.676925 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-scripts\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.677369 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-additional-scripts\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.677408 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run-ovn\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.677458 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-log-ovn\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.677503 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.677546 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6wjb\" (UniqueName: \"kubernetes.io/projected/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-kube-api-access-j6wjb\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.779710 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6wjb\" (UniqueName: \"kubernetes.io/projected/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-kube-api-access-j6wjb\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.779828 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-scripts\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.779930 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-additional-scripts\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.779964 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run-ovn\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.780012 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-log-ovn\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.780057 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.780419 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.780487 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run-ovn\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.780536 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-log-ovn\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.780822 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-additional-scripts\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.783187 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-scripts\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.798787 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6wjb\" (UniqueName: \"kubernetes.io/projected/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-kube-api-access-j6wjb\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.935815 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9zkpb-config-9k74f"
Feb 27 00:26:15 crc kubenswrapper[4781]: I0227 00:26:15.925282 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-wxsbg"]
Feb 27 00:26:15 crc kubenswrapper[4781]: I0227 00:26:15.927822 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wxsbg"
Feb 27 00:26:15 crc kubenswrapper[4781]: I0227 00:26:15.931075 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 27 00:26:15 crc kubenswrapper[4781]: I0227 00:26:15.935331 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wxsbg"]
Feb 27 00:26:16 crc kubenswrapper[4781]: I0227 00:26:16.028161 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cafd294d-e929-4cd5-8be3-7175ad4aed09-operator-scripts\") pod \"root-account-create-update-wxsbg\" (UID: \"cafd294d-e929-4cd5-8be3-7175ad4aed09\") " pod="openstack/root-account-create-update-wxsbg"
Feb 27 00:26:16 crc kubenswrapper[4781]: I0227 00:26:16.028372 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zhb8\" (UniqueName: \"kubernetes.io/projected/cafd294d-e929-4cd5-8be3-7175ad4aed09-kube-api-access-5zhb8\") pod \"root-account-create-update-wxsbg\" (UID: \"cafd294d-e929-4cd5-8be3-7175ad4aed09\") " pod="openstack/root-account-create-update-wxsbg"
Feb 27 00:26:16 crc kubenswrapper[4781]: I0227 00:26:16.129737 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zhb8\" (UniqueName: \"kubernetes.io/projected/cafd294d-e929-4cd5-8be3-7175ad4aed09-kube-api-access-5zhb8\") pod \"root-account-create-update-wxsbg\" (UID: \"cafd294d-e929-4cd5-8be3-7175ad4aed09\") " 
pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:16 crc kubenswrapper[4781]: I0227 00:26:16.129833 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cafd294d-e929-4cd5-8be3-7175ad4aed09-operator-scripts\") pod \"root-account-create-update-wxsbg\" (UID: \"cafd294d-e929-4cd5-8be3-7175ad4aed09\") " pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:16 crc kubenswrapper[4781]: I0227 00:26:16.130845 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cafd294d-e929-4cd5-8be3-7175ad4aed09-operator-scripts\") pod \"root-account-create-update-wxsbg\" (UID: \"cafd294d-e929-4cd5-8be3-7175ad4aed09\") " pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:16 crc kubenswrapper[4781]: I0227 00:26:16.149703 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zhb8\" (UniqueName: \"kubernetes.io/projected/cafd294d-e929-4cd5-8be3-7175ad4aed09-kube-api-access-5zhb8\") pod \"root-account-create-update-wxsbg\" (UID: \"cafd294d-e929-4cd5-8be3-7175ad4aed09\") " pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:16 crc kubenswrapper[4781]: I0227 00:26:16.261665 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:16 crc kubenswrapper[4781]: I0227 00:26:16.950965 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="2691e066-2f4c-4e7e-bcac-01933bd6cadb" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 27 00:26:17 crc kubenswrapper[4781]: I0227 00:26:17.546423 4781 scope.go:117] "RemoveContainer" containerID="3eb6fa2c40c5ff8bd90c7472dc3a2b552bb7c38236a559c08d25c903e216a06b" Feb 27 00:26:18 crc kubenswrapper[4781]: I0227 00:26:18.271121 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:26:18 crc kubenswrapper[4781]: I0227 00:26:18.285535 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:26:18 crc kubenswrapper[4781]: I0227 00:26:18.567461 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.365573 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.648662 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-zvn4t"] Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.650044 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.677686 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-zvn4t"] Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.764341 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.805417 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh4qm\" (UniqueName: \"kubernetes.io/projected/e8806487-486f-464d-8249-b6368daabff5-kube-api-access-qh4qm\") pod \"cinder-db-create-zvn4t\" (UID: \"e8806487-486f-464d-8249-b6368daabff5\") " pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.805506 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8806487-486f-464d-8249-b6368daabff5-operator-scripts\") pod \"cinder-db-create-zvn4t\" (UID: \"e8806487-486f-464d-8249-b6368daabff5\") " pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.808500 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-6e38-account-create-update-dntk2"] Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.809944 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.812981 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.833293 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6e38-account-create-update-dntk2"] Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.864753 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-99xdp"] Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.866281 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.907405 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh4qm\" (UniqueName: \"kubernetes.io/projected/e8806487-486f-464d-8249-b6368daabff5-kube-api-access-qh4qm\") pod \"cinder-db-create-zvn4t\" (UID: \"e8806487-486f-464d-8249-b6368daabff5\") " pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.907514 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0eb55288-e9bb-46f0-bae3-789e8db036cf-operator-scripts\") pod \"cinder-6e38-account-create-update-dntk2\" (UID: \"0eb55288-e9bb-46f0-bae3-789e8db036cf\") " pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.907551 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8806487-486f-464d-8249-b6368daabff5-operator-scripts\") pod \"cinder-db-create-zvn4t\" (UID: \"e8806487-486f-464d-8249-b6368daabff5\") " pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:19 crc 
kubenswrapper[4781]: I0227 00:26:19.907570 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjdvb\" (UniqueName: \"kubernetes.io/projected/0eb55288-e9bb-46f0-bae3-789e8db036cf-kube-api-access-wjdvb\") pod \"cinder-6e38-account-create-update-dntk2\" (UID: \"0eb55288-e9bb-46f0-bae3-789e8db036cf\") " pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.908845 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8806487-486f-464d-8249-b6368daabff5-operator-scripts\") pod \"cinder-db-create-zvn4t\" (UID: \"e8806487-486f-464d-8249-b6368daabff5\") " pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.917512 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-99xdp"] Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.962416 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh4qm\" (UniqueName: \"kubernetes.io/projected/e8806487-486f-464d-8249-b6368daabff5-kube-api-access-qh4qm\") pod \"cinder-db-create-zvn4t\" (UID: \"e8806487-486f-464d-8249-b6368daabff5\") " pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.974106 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.008828 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0eb55288-e9bb-46f0-bae3-789e8db036cf-operator-scripts\") pod \"cinder-6e38-account-create-update-dntk2\" (UID: \"0eb55288-e9bb-46f0-bae3-789e8db036cf\") " pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.009352 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjdvb\" (UniqueName: \"kubernetes.io/projected/0eb55288-e9bb-46f0-bae3-789e8db036cf-kube-api-access-wjdvb\") pod \"cinder-6e38-account-create-update-dntk2\" (UID: \"0eb55288-e9bb-46f0-bae3-789e8db036cf\") " pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.010477 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24adb929-f812-4243-94ea-23345856d28f-operator-scripts\") pod \"cloudkitty-db-create-99xdp\" (UID: \"24adb929-f812-4243-94ea-23345856d28f\") " pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.010891 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cr8q\" (UniqueName: \"kubernetes.io/projected/24adb929-f812-4243-94ea-23345856d28f-kube-api-access-6cr8q\") pod \"cloudkitty-db-create-99xdp\" (UID: \"24adb929-f812-4243-94ea-23345856d28f\") " pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.010351 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0eb55288-e9bb-46f0-bae3-789e8db036cf-operator-scripts\") pod 
\"cinder-6e38-account-create-update-dntk2\" (UID: \"0eb55288-e9bb-46f0-bae3-789e8db036cf\") " pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.010511 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-m5rm5"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.012362 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.039398 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjdvb\" (UniqueName: \"kubernetes.io/projected/0eb55288-e9bb-46f0-bae3-789e8db036cf-kube-api-access-wjdvb\") pod \"cinder-6e38-account-create-update-dntk2\" (UID: \"0eb55288-e9bb-46f0-bae3-789e8db036cf\") " pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.051938 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-m5rm5"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.114929 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-operator-scripts\") pod \"barbican-db-create-m5rm5\" (UID: \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\") " pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.115417 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24adb929-f812-4243-94ea-23345856d28f-operator-scripts\") pod \"cloudkitty-db-create-99xdp\" (UID: \"24adb929-f812-4243-94ea-23345856d28f\") " pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.115461 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdgbd\" (UniqueName: \"kubernetes.io/projected/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-kube-api-access-qdgbd\") pod \"barbican-db-create-m5rm5\" (UID: \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\") " pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.115486 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cr8q\" (UniqueName: \"kubernetes.io/projected/24adb929-f812-4243-94ea-23345856d28f-kube-api-access-6cr8q\") pod \"cloudkitty-db-create-99xdp\" (UID: \"24adb929-f812-4243-94ea-23345856d28f\") " pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.116383 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24adb929-f812-4243-94ea-23345856d28f-operator-scripts\") pod \"cloudkitty-db-create-99xdp\" (UID: \"24adb929-f812-4243-94ea-23345856d28f\") " pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.138169 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cr8q\" (UniqueName: \"kubernetes.io/projected/24adb929-f812-4243-94ea-23345856d28f-kube-api-access-6cr8q\") pod \"cloudkitty-db-create-99xdp\" (UID: \"24adb929-f812-4243-94ea-23345856d28f\") " pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.160447 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.190162 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-v2g9n"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.191377 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.203407 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-rs9bx"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.204753 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.205042 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.209314 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.209482 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.209541 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nhgp" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.215746 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.216872 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdgbd\" (UniqueName: \"kubernetes.io/projected/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-kube-api-access-qdgbd\") pod \"barbican-db-create-m5rm5\" (UID: \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\") " pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.216912 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-operator-scripts\") pod \"barbican-db-create-m5rm5\" (UID: \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\") " 
pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.217590 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-operator-scripts\") pod \"barbican-db-create-m5rm5\" (UID: \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\") " pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.233301 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-a05d-account-create-update-cw8zv"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.234420 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.239042 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.268801 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdgbd\" (UniqueName: \"kubernetes.io/projected/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-kube-api-access-qdgbd\") pod \"barbican-db-create-m5rm5\" (UID: \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\") " pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.270751 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-a05d-account-create-update-cw8zv"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.312188 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-v2g9n"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.357083 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.362793 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29n8z\" (UniqueName: \"kubernetes.io/projected/58b577a3-c234-4968-a8e7-c5e629de47b1-kube-api-access-29n8z\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.362933 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfxln\" (UniqueName: \"kubernetes.io/projected/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-kube-api-access-jfxln\") pod \"neutron-db-create-v2g9n\" (UID: \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\") " pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.362961 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmwsr\" (UniqueName: \"kubernetes.io/projected/6344c1fe-eecb-4d57-a5c7-a857e4466439-kube-api-access-rmwsr\") pod \"cloudkitty-a05d-account-create-update-cw8zv\" (UID: \"6344c1fe-eecb-4d57-a5c7-a857e4466439\") " pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.362989 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6344c1fe-eecb-4d57-a5c7-a857e4466439-operator-scripts\") pod \"cloudkitty-a05d-account-create-update-cw8zv\" (UID: \"6344c1fe-eecb-4d57-a5c7-a857e4466439\") " pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.363055 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-config-data\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.363089 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-combined-ca-bundle\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.363220 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-operator-scripts\") pod \"neutron-db-create-v2g9n\" (UID: \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\") " pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.418975 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rs9bx"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.461794 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-a8a8-account-create-update-vcwwx"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.463313 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469148 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469142 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-operator-scripts\") pod \"neutron-db-create-v2g9n\" (UID: \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\") " pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469456 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29n8z\" (UniqueName: \"kubernetes.io/projected/58b577a3-c234-4968-a8e7-c5e629de47b1-kube-api-access-29n8z\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469706 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfxln\" (UniqueName: \"kubernetes.io/projected/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-kube-api-access-jfxln\") pod \"neutron-db-create-v2g9n\" (UID: \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\") " pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469732 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmwsr\" (UniqueName: \"kubernetes.io/projected/6344c1fe-eecb-4d57-a5c7-a857e4466439-kube-api-access-rmwsr\") pod \"cloudkitty-a05d-account-create-update-cw8zv\" (UID: \"6344c1fe-eecb-4d57-a5c7-a857e4466439\") " pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469757 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6344c1fe-eecb-4d57-a5c7-a857e4466439-operator-scripts\") pod \"cloudkitty-a05d-account-create-update-cw8zv\" (UID: \"6344c1fe-eecb-4d57-a5c7-a857e4466439\") " pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469769 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-operator-scripts\") pod \"neutron-db-create-v2g9n\" (UID: \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\") " pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469869 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-config-data\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469916 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-combined-ca-bundle\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.471449 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6344c1fe-eecb-4d57-a5c7-a857e4466439-operator-scripts\") pod \"cloudkitty-a05d-account-create-update-cw8zv\" (UID: \"6344c1fe-eecb-4d57-a5c7-a857e4466439\") " pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.473144 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-combined-ca-bundle\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.476917 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-config-data\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.485718 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a8a8-account-create-update-vcwwx"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.491651 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29n8z\" (UniqueName: \"kubernetes.io/projected/58b577a3-c234-4968-a8e7-c5e629de47b1-kube-api-access-29n8z\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.492810 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmwsr\" (UniqueName: \"kubernetes.io/projected/6344c1fe-eecb-4d57-a5c7-a857e4466439-kube-api-access-rmwsr\") pod \"cloudkitty-a05d-account-create-update-cw8zv\" (UID: \"6344c1fe-eecb-4d57-a5c7-a857e4466439\") " pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.495256 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfxln\" (UniqueName: \"kubernetes.io/projected/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-kube-api-access-jfxln\") pod \"neutron-db-create-v2g9n\" (UID: \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\") " pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.496480 4781 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4bde-account-create-update-fpg2t"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.497698 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.499743 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.504753 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4bde-account-create-update-fpg2t"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.534006 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.571210 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-operator-scripts\") pod \"neutron-4bde-account-create-update-fpg2t\" (UID: \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\") " pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.571295 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfkxw\" (UniqueName: \"kubernetes.io/projected/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-kube-api-access-dfkxw\") pod \"neutron-4bde-account-create-update-fpg2t\" (UID: \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\") " pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.571322 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/388be198-b438-4142-8fb8-ec9831e9a1af-operator-scripts\") pod 
\"barbican-a8a8-account-create-update-vcwwx\" (UID: \"388be198-b438-4142-8fb8-ec9831e9a1af\") " pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.571339 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4dsb\" (UniqueName: \"kubernetes.io/projected/388be198-b438-4142-8fb8-ec9831e9a1af-kube-api-access-p4dsb\") pod \"barbican-a8a8-account-create-update-vcwwx\" (UID: \"388be198-b438-4142-8fb8-ec9831e9a1af\") " pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.647573 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.667006 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.673541 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfkxw\" (UniqueName: \"kubernetes.io/projected/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-kube-api-access-dfkxw\") pod \"neutron-4bde-account-create-update-fpg2t\" (UID: \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\") " pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.673587 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4dsb\" (UniqueName: \"kubernetes.io/projected/388be198-b438-4142-8fb8-ec9831e9a1af-kube-api-access-p4dsb\") pod \"barbican-a8a8-account-create-update-vcwwx\" (UID: \"388be198-b438-4142-8fb8-ec9831e9a1af\") " pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.673613 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/388be198-b438-4142-8fb8-ec9831e9a1af-operator-scripts\") pod \"barbican-a8a8-account-create-update-vcwwx\" (UID: \"388be198-b438-4142-8fb8-ec9831e9a1af\") " pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.673823 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-operator-scripts\") pod \"neutron-4bde-account-create-update-fpg2t\" (UID: \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\") " pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.674365 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/388be198-b438-4142-8fb8-ec9831e9a1af-operator-scripts\") pod \"barbican-a8a8-account-create-update-vcwwx\" (UID: \"388be198-b438-4142-8fb8-ec9831e9a1af\") " pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.674505 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-operator-scripts\") pod \"neutron-4bde-account-create-update-fpg2t\" (UID: \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\") " pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.689434 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfkxw\" (UniqueName: \"kubernetes.io/projected/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-kube-api-access-dfkxw\") pod \"neutron-4bde-account-create-update-fpg2t\" (UID: \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\") " pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.690072 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4dsb\" (UniqueName: \"kubernetes.io/projected/388be198-b438-4142-8fb8-ec9831e9a1af-kube-api-access-p4dsb\") pod \"barbican-a8a8-account-create-update-vcwwx\" (UID: \"388be198-b438-4142-8fb8-ec9831e9a1af\") " pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.817457 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.826832 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.231084 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8tmft" event={"ID":"47cc3f01-6a5c-4797-bf86-25770e66e928","Type":"ContainerStarted","Data":"e75379ab5c604b926c8da8b4e1bc70d938265b4b81cac412dc92c66988d11e4a"} Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.236571 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerStarted","Data":"0d295c8666e863d2c0e4e0d3a3e33356c58f61c54e944f8ced4d911133124bc0"} Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.243998 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-a05d-account-create-update-cw8zv"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.258245 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rs9bx"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.282362 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-8tmft" podStartSLOduration=2.179377493 podStartE2EDuration="17.282340502s" podCreationTimestamp="2026-02-27 00:26:08 +0000 UTC" 
firstStartedPulling="2026-02-27 00:26:09.150880861 +0000 UTC m=+1238.408420415" lastFinishedPulling="2026-02-27 00:26:24.25384387 +0000 UTC m=+1253.511383424" observedRunningTime="2026-02-27 00:26:25.271422723 +0000 UTC m=+1254.528962277" watchObservedRunningTime="2026-02-27 00:26:25.282340502 +0000 UTC m=+1254.539880056" Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.320387 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 00:26:25 crc kubenswrapper[4781]: W0227 00:26:25.327253 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod119c35de_5e7a_4d3f_af8a_3595d7dc69aa.slice/crio-8c83495fdbbabbe88d3f2dcacc03d6929c80894b6bc99aa2756ea3f2ad7bcd7b WatchSource:0}: Error finding container 8c83495fdbbabbe88d3f2dcacc03d6929c80894b6bc99aa2756ea3f2ad7bcd7b: Status 404 returned error can't find the container with id 8c83495fdbbabbe88d3f2dcacc03d6929c80894b6bc99aa2756ea3f2ad7bcd7b Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.347265 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9zkpb-config-9k74f"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.355676 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-v2g9n"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.362619 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-99xdp"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.411115 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a8a8-account-create-update-vcwwx"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.430201 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wxsbg"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.431812 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.120108302 podStartE2EDuration="1m21.43179206s" podCreationTimestamp="2026-02-27 00:25:04 +0000 UTC" firstStartedPulling="2026-02-27 00:25:18.874938737 +0000 UTC m=+1188.132478291" lastFinishedPulling="2026-02-27 00:26:24.186622495 +0000 UTC m=+1253.444162049" observedRunningTime="2026-02-27 00:26:25.328933733 +0000 UTC m=+1254.586473287" watchObservedRunningTime="2026-02-27 00:26:25.43179206 +0000 UTC m=+1254.689331614" Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.447864 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-m5rm5"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.454688 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4bde-account-create-update-fpg2t"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.461091 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-zvn4t"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.468242 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6e38-account-create-update-dntk2"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.475788 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.098091 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.252250 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-v2g9n" event={"ID":"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9","Type":"ContainerStarted","Data":"c1465b73a1df33b94300981b2d1ed1143dd7203d14e97be01d951e1a43d63b4b"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.252485 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-v2g9n" 
event={"ID":"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9","Type":"ContainerStarted","Data":"30f6389e339c637a76ed83badeaae69166aa47b033d4e6235f9518305d3ee600"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.255130 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"94beb6abb1958b96717d500a6631fce3acfe4486c10c8cc84b786a985608d0c9"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.258036 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" event={"ID":"6344c1fe-eecb-4d57-a5c7-a857e4466439","Type":"ContainerDied","Data":"6d76d1e8767f2bf9f86c0f509bcf89309b39540bcf16a94f15017d9639753143"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.264921 4781 generic.go:334] "Generic (PLEG): container finished" podID="6344c1fe-eecb-4d57-a5c7-a857e4466439" containerID="6d76d1e8767f2bf9f86c0f509bcf89309b39540bcf16a94f15017d9639753143" exitCode=0 Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.265057 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" event={"ID":"6344c1fe-eecb-4d57-a5c7-a857e4466439","Type":"ContainerStarted","Data":"3156fc4e82e446fff06ff8e16e3fad71473d705c6d619f1c07498df33a7e7f1a"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.271606 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-v2g9n" podStartSLOduration=6.271591056 podStartE2EDuration="6.271591056s" podCreationTimestamp="2026-02-27 00:26:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:26.268112854 +0000 UTC m=+1255.525652408" watchObservedRunningTime="2026-02-27 00:26:26.271591056 +0000 UTC m=+1255.529130610" Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.275801 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zvn4t" event={"ID":"e8806487-486f-464d-8249-b6368daabff5","Type":"ContainerStarted","Data":"08f09b8baf0d256e75e4f2cea8a8050728aa867b805093cf4bae153a92736b36"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.275840 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zvn4t" event={"ID":"e8806487-486f-464d-8249-b6368daabff5","Type":"ContainerStarted","Data":"acca40850a65ffa395f9bc3d270f6d4791d803b54b4d4bdd03263ee754d9ce94"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.283346 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4bde-account-create-update-fpg2t" event={"ID":"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf","Type":"ContainerStarted","Data":"d19d827d09664d0dd3483609af04ecbb9a2549b9335d9da322a84e9180f2130b"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.283389 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4bde-account-create-update-fpg2t" event={"ID":"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf","Type":"ContainerStarted","Data":"12fea8b77cbfdc1c7a64b2da8721e9a6a6590ac1e465f6d0c6156cdbd111bf81"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.286675 4781 generic.go:334] "Generic (PLEG): container finished" podID="24adb929-f812-4243-94ea-23345856d28f" containerID="9fc8ab8561670a45356ed0c0f51ff964f3556019e4a98628e764c0be8c981d4c" exitCode=0 Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.286726 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-99xdp" event={"ID":"24adb929-f812-4243-94ea-23345856d28f","Type":"ContainerDied","Data":"9fc8ab8561670a45356ed0c0f51ff964f3556019e4a98628e764c0be8c981d4c"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.286749 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-99xdp" 
event={"ID":"24adb929-f812-4243-94ea-23345856d28f","Type":"ContainerStarted","Data":"afb6fce1583a2499e12761dfb2a4e40745be9e1f5459d6226e780d1b7f1a01e4"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.292327 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb-config-9k74f" event={"ID":"119c35de-5e7a-4d3f-af8a-3595d7dc69aa","Type":"ContainerStarted","Data":"beeaff089c6577afca77da55c908132f8c47a3993cf1d2011eea873db182b172"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.292377 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb-config-9k74f" event={"ID":"119c35de-5e7a-4d3f-af8a-3595d7dc69aa","Type":"ContainerStarted","Data":"8c83495fdbbabbe88d3f2dcacc03d6929c80894b6bc99aa2756ea3f2ad7bcd7b"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.297852 4781 generic.go:334] "Generic (PLEG): container finished" podID="388be198-b438-4142-8fb8-ec9831e9a1af" containerID="f6f1fd0f3e8826d700e5044d1fe1b6b827695311ff2f847e95e5ba49a2863393" exitCode=0 Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.297899 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a8a8-account-create-update-vcwwx" event={"ID":"388be198-b438-4142-8fb8-ec9831e9a1af","Type":"ContainerDied","Data":"f6f1fd0f3e8826d700e5044d1fe1b6b827695311ff2f847e95e5ba49a2863393"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.297949 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a8a8-account-create-update-vcwwx" event={"ID":"388be198-b438-4142-8fb8-ec9831e9a1af","Type":"ContainerStarted","Data":"42d459e8a3a5d054ca75741044d1df38f0502d5c39db8635a77da335cae8d852"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.301668 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rs9bx" 
event={"ID":"58b577a3-c234-4968-a8e7-c5e629de47b1","Type":"ContainerStarted","Data":"9adcda9f33abe1db2735efdad7642538c67bab67f10201630c502ea9fc7b9c52"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.303220 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-zvn4t" podStartSLOduration=7.303199271 podStartE2EDuration="7.303199271s" podCreationTimestamp="2026-02-27 00:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:26.295797655 +0000 UTC m=+1255.553337209" watchObservedRunningTime="2026-02-27 00:26:26.303199271 +0000 UTC m=+1255.560738825" Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.304780 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-m5rm5" event={"ID":"e3aedfe4-2bbb-46c9-97d4-8d6782c44707","Type":"ContainerStarted","Data":"6dace96637328dc4640d3549a1c802cf99efe23b4ad5c291813668a60dc8b49e"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.304813 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-m5rm5" event={"ID":"e3aedfe4-2bbb-46c9-97d4-8d6782c44707","Type":"ContainerStarted","Data":"4306df6b2bf7f7647c5412693ad76ea23e53776c7458f39ce7d1dce54668a6a1"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.306184 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wxsbg" event={"ID":"cafd294d-e929-4cd5-8be3-7175ad4aed09","Type":"ContainerStarted","Data":"6743d7b0c9868a62aac9ecae7e0ec57bc1eee6923be88c6054b55ea63c96129c"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.306208 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wxsbg" event={"ID":"cafd294d-e929-4cd5-8be3-7175ad4aed09","Type":"ContainerStarted","Data":"d0d777854ef43da96dbfe08c3e3579fc6ad7e045141adca768cf705a1bf88479"} Feb 27 00:26:26 crc 
kubenswrapper[4781]: I0227 00:26:26.316749 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-4bde-account-create-update-fpg2t" podStartSLOduration=6.316726738 podStartE2EDuration="6.316726738s" podCreationTimestamp="2026-02-27 00:26:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:26.316726248 +0000 UTC m=+1255.574265802" watchObservedRunningTime="2026-02-27 00:26:26.316726738 +0000 UTC m=+1255.574266292" Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.325197 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6e38-account-create-update-dntk2" event={"ID":"0eb55288-e9bb-46f0-bae3-789e8db036cf","Type":"ContainerStarted","Data":"297b6944b15c3822e081c593733409a3c29b72246756946b04eaf97a2a16c5d2"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.325231 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6e38-account-create-update-dntk2" event={"ID":"0eb55288-e9bb-46f0-bae3-789e8db036cf","Type":"ContainerStarted","Data":"d82ba0a5a4bcff589056e0ce6141c47f60a6cc39ed51e693d6fad4b70237438f"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.337308 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9zkpb-config-9k74f" podStartSLOduration=13.337293591 podStartE2EDuration="13.337293591s" podCreationTimestamp="2026-02-27 00:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:26.335400431 +0000 UTC m=+1255.592939985" watchObservedRunningTime="2026-02-27 00:26:26.337293591 +0000 UTC m=+1255.594833145" Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.371850 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-wxsbg" 
podStartSLOduration=11.371832254 podStartE2EDuration="11.371832254s" podCreationTimestamp="2026-02-27 00:26:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:26.369011479 +0000 UTC m=+1255.626551033" watchObservedRunningTime="2026-02-27 00:26:26.371832254 +0000 UTC m=+1255.629371808" Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.411144 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-m5rm5" podStartSLOduration=7.411127672 podStartE2EDuration="7.411127672s" podCreationTimestamp="2026-02-27 00:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:26.405253427 +0000 UTC m=+1255.662792981" watchObservedRunningTime="2026-02-27 00:26:26.411127672 +0000 UTC m=+1255.668667226" Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.964679 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="2691e066-2f4c-4e7e-bcac-01933bd6cadb" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.339911 4781 generic.go:334] "Generic (PLEG): container finished" podID="e8806487-486f-464d-8249-b6368daabff5" containerID="08f09b8baf0d256e75e4f2cea8a8050728aa867b805093cf4bae153a92736b36" exitCode=0 Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.340002 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zvn4t" event={"ID":"e8806487-486f-464d-8249-b6368daabff5","Type":"ContainerDied","Data":"08f09b8baf0d256e75e4f2cea8a8050728aa867b805093cf4bae153a92736b36"} Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.341830 4781 generic.go:334] "Generic (PLEG): container finished" podID="5b9cc074-4ea1-4c04-9398-5be68fbcd5cf" 
containerID="d19d827d09664d0dd3483609af04ecbb9a2549b9335d9da322a84e9180f2130b" exitCode=0 Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.341894 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4bde-account-create-update-fpg2t" event={"ID":"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf","Type":"ContainerDied","Data":"d19d827d09664d0dd3483609af04ecbb9a2549b9335d9da322a84e9180f2130b"} Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.343697 4781 generic.go:334] "Generic (PLEG): container finished" podID="0eb55288-e9bb-46f0-bae3-789e8db036cf" containerID="297b6944b15c3822e081c593733409a3c29b72246756946b04eaf97a2a16c5d2" exitCode=0 Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.343752 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6e38-account-create-update-dntk2" event={"ID":"0eb55288-e9bb-46f0-bae3-789e8db036cf","Type":"ContainerDied","Data":"297b6944b15c3822e081c593733409a3c29b72246756946b04eaf97a2a16c5d2"} Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.346172 4781 generic.go:334] "Generic (PLEG): container finished" podID="cafd294d-e929-4cd5-8be3-7175ad4aed09" containerID="6743d7b0c9868a62aac9ecae7e0ec57bc1eee6923be88c6054b55ea63c96129c" exitCode=0 Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.346241 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wxsbg" event={"ID":"cafd294d-e929-4cd5-8be3-7175ad4aed09","Type":"ContainerDied","Data":"6743d7b0c9868a62aac9ecae7e0ec57bc1eee6923be88c6054b55ea63c96129c"} Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.348324 4781 generic.go:334] "Generic (PLEG): container finished" podID="3ae26ad0-3770-4153-a1d6-96ae3a9e36a9" containerID="c1465b73a1df33b94300981b2d1ed1143dd7203d14e97be01d951e1a43d63b4b" exitCode=0 Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.348383 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-v2g9n" 
event={"ID":"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9","Type":"ContainerDied","Data":"c1465b73a1df33b94300981b2d1ed1143dd7203d14e97be01d951e1a43d63b4b"} Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.365505 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"f853867bfa4e786c83dc2205099ab25248d1c85dc15de999b04de22c2ab4daf6"} Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.370366 4781 generic.go:334] "Generic (PLEG): container finished" podID="119c35de-5e7a-4d3f-af8a-3595d7dc69aa" containerID="beeaff089c6577afca77da55c908132f8c47a3993cf1d2011eea873db182b172" exitCode=0 Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.370459 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb-config-9k74f" event={"ID":"119c35de-5e7a-4d3f-af8a-3595d7dc69aa","Type":"ContainerDied","Data":"beeaff089c6577afca77da55c908132f8c47a3993cf1d2011eea873db182b172"} Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.376900 4781 generic.go:334] "Generic (PLEG): container finished" podID="e3aedfe4-2bbb-46c9-97d4-8d6782c44707" containerID="6dace96637328dc4640d3549a1c802cf99efe23b4ad5c291813668a60dc8b49e" exitCode=0 Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.377120 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-m5rm5" event={"ID":"e3aedfe4-2bbb-46c9-97d4-8d6782c44707","Type":"ContainerDied","Data":"6dace96637328dc4640d3549a1c802cf99efe23b4ad5c291813668a60dc8b49e"} Feb 27 00:26:28 crc kubenswrapper[4781]: I0227 00:26:28.389994 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"f1a98806257acaa19b4b9b86ece8903557df5c930cd956c6c9c8e2bb9dcd294c"} Feb 27 00:26:28 crc kubenswrapper[4781]: I0227 00:26:28.390425 4781 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"4cc9e5526c315ad1df3a3d8a30dc027e8f03c4458e5f81b28f262a635d116fb9"} Feb 27 00:26:28 crc kubenswrapper[4781]: I0227 00:26:28.412226 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9zkpb" Feb 27 00:26:29 crc kubenswrapper[4781]: I0227 00:26:29.415907 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.350105 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.378038 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.440069 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.465019 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a8a8-account-create-update-vcwwx" event={"ID":"388be198-b438-4142-8fb8-ec9831e9a1af","Type":"ContainerDied","Data":"42d459e8a3a5d054ca75741044d1df38f0502d5c39db8635a77da335cae8d852"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.465056 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42d459e8a3a5d054ca75741044d1df38f0502d5c39db8635a77da335cae8d852" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.473831 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.474016 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-m5rm5" event={"ID":"e3aedfe4-2bbb-46c9-97d4-8d6782c44707","Type":"ContainerDied","Data":"4306df6b2bf7f7647c5412693ad76ea23e53776c7458f39ce7d1dce54668a6a1"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.474044 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4306df6b2bf7f7647c5412693ad76ea23e53776c7458f39ce7d1dce54668a6a1" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.479190 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4bde-account-create-update-fpg2t" event={"ID":"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf","Type":"ContainerDied","Data":"12fea8b77cbfdc1c7a64b2da8721e9a6a6590ac1e465f6d0c6156cdbd111bf81"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.479238 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12fea8b77cbfdc1c7a64b2da8721e9a6a6590ac1e465f6d0c6156cdbd111bf81" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.479270 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.489960 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6e38-account-create-update-dntk2" event={"ID":"0eb55288-e9bb-46f0-bae3-789e8db036cf","Type":"ContainerDied","Data":"d82ba0a5a4bcff589056e0ce6141c47f60a6cc39ed51e693d6fad4b70237438f"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.490005 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d82ba0a5a4bcff589056e0ce6141c47f60a6cc39ed51e693d6fad4b70237438f" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.490203 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508538 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0eb55288-e9bb-46f0-bae3-789e8db036cf-operator-scripts\") pod \"0eb55288-e9bb-46f0-bae3-789e8db036cf\" (UID: \"0eb55288-e9bb-46f0-bae3-789e8db036cf\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508589 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-scripts\") pod \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508615 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cafd294d-e929-4cd5-8be3-7175ad4aed09-operator-scripts\") pod \"cafd294d-e929-4cd5-8be3-7175ad4aed09\" (UID: \"cafd294d-e929-4cd5-8be3-7175ad4aed09\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508677 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run-ovn\") pod \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508719 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfxln\" (UniqueName: \"kubernetes.io/projected/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-kube-api-access-jfxln\") pod \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\" (UID: \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508748 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjdvb\" (UniqueName: \"kubernetes.io/projected/0eb55288-e9bb-46f0-bae3-789e8db036cf-kube-api-access-wjdvb\") pod \"0eb55288-e9bb-46f0-bae3-789e8db036cf\" (UID: \"0eb55288-e9bb-46f0-bae3-789e8db036cf\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508789 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-additional-scripts\") pod \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508814 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-operator-scripts\") pod \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\" (UID: \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508839 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zhb8\" (UniqueName: \"kubernetes.io/projected/cafd294d-e929-4cd5-8be3-7175ad4aed09-kube-api-access-5zhb8\") pod 
\"cafd294d-e929-4cd5-8be3-7175ad4aed09\" (UID: \"cafd294d-e929-4cd5-8be3-7175ad4aed09\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508912 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-log-ovn\") pod \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508942 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfkxw\" (UniqueName: \"kubernetes.io/projected/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-kube-api-access-dfkxw\") pod \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\" (UID: \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508969 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6wjb\" (UniqueName: \"kubernetes.io/projected/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-kube-api-access-j6wjb\") pod \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.509037 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run\") pod \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.509069 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-operator-scripts\") pod \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\" (UID: \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.510201 4781 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run" (OuterVolumeSpecName: "var-run") pod "119c35de-5e7a-4d3f-af8a-3595d7dc69aa" (UID: "119c35de-5e7a-4d3f-af8a-3595d7dc69aa"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.510381 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "119c35de-5e7a-4d3f-af8a-3595d7dc69aa" (UID: "119c35de-5e7a-4d3f-af8a-3595d7dc69aa"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.510899 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "119c35de-5e7a-4d3f-af8a-3595d7dc69aa" (UID: "119c35de-5e7a-4d3f-af8a-3595d7dc69aa"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.510996 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ae26ad0-3770-4153-a1d6-96ae3a9e36a9" (UID: "3ae26ad0-3770-4153-a1d6-96ae3a9e36a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.511074 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-scripts" (OuterVolumeSpecName: "scripts") pod "119c35de-5e7a-4d3f-af8a-3595d7dc69aa" (UID: "119c35de-5e7a-4d3f-af8a-3595d7dc69aa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.511326 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eb55288-e9bb-46f0-bae3-789e8db036cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0eb55288-e9bb-46f0-bae3-789e8db036cf" (UID: "0eb55288-e9bb-46f0-bae3-789e8db036cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.511508 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.511807 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cafd294d-e929-4cd5-8be3-7175ad4aed09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cafd294d-e929-4cd5-8be3-7175ad4aed09" (UID: "cafd294d-e929-4cd5-8be3-7175ad4aed09"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.512446 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "119c35de-5e7a-4d3f-af8a-3595d7dc69aa" (UID: "119c35de-5e7a-4d3f-af8a-3595d7dc69aa"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.518795 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eb55288-e9bb-46f0-bae3-789e8db036cf-kube-api-access-wjdvb" (OuterVolumeSpecName: "kube-api-access-wjdvb") pod "0eb55288-e9bb-46f0-bae3-789e8db036cf" (UID: "0eb55288-e9bb-46f0-bae3-789e8db036cf"). 
InnerVolumeSpecName "kube-api-access-wjdvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.522810 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-kube-api-access-j6wjb" (OuterVolumeSpecName: "kube-api-access-j6wjb") pod "119c35de-5e7a-4d3f-af8a-3595d7dc69aa" (UID: "119c35de-5e7a-4d3f-af8a-3595d7dc69aa"). InnerVolumeSpecName "kube-api-access-j6wjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.523324 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cafd294d-e929-4cd5-8be3-7175ad4aed09-kube-api-access-5zhb8" (OuterVolumeSpecName: "kube-api-access-5zhb8") pod "cafd294d-e929-4cd5-8be3-7175ad4aed09" (UID: "cafd294d-e929-4cd5-8be3-7175ad4aed09"). InnerVolumeSpecName "kube-api-access-5zhb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.524552 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-kube-api-access-dfkxw" (OuterVolumeSpecName: "kube-api-access-dfkxw") pod "5b9cc074-4ea1-4c04-9398-5be68fbcd5cf" (UID: "5b9cc074-4ea1-4c04-9398-5be68fbcd5cf"). InnerVolumeSpecName "kube-api-access-dfkxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.524969 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wxsbg" event={"ID":"cafd294d-e929-4cd5-8be3-7175ad4aed09","Type":"ContainerDied","Data":"d0d777854ef43da96dbfe08c3e3579fc6ad7e045141adca768cf705a1bf88479"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.525021 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0d777854ef43da96dbfe08c3e3579fc6ad7e045141adca768cf705a1bf88479" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.525198 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.525809 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b9cc074-4ea1-4c04-9398-5be68fbcd5cf" (UID: "5b9cc074-4ea1-4c04-9398-5be68fbcd5cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.544826 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-kube-api-access-jfxln" (OuterVolumeSpecName: "kube-api-access-jfxln") pod "3ae26ad0-3770-4153-a1d6-96ae3a9e36a9" (UID: "3ae26ad0-3770-4153-a1d6-96ae3a9e36a9"). InnerVolumeSpecName "kube-api-access-jfxln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.545098 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-v2g9n" event={"ID":"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9","Type":"ContainerDied","Data":"30f6389e339c637a76ed83badeaae69166aa47b033d4e6235f9518305d3ee600"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.545129 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30f6389e339c637a76ed83badeaae69166aa47b033d4e6235f9518305d3ee600" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.545239 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.545293 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.558959 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb-config-9k74f" event={"ID":"119c35de-5e7a-4d3f-af8a-3595d7dc69aa","Type":"ContainerDied","Data":"8c83495fdbbabbe88d3f2dcacc03d6929c80894b6bc99aa2756ea3f2ad7bcd7b"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.559011 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c83495fdbbabbe88d3f2dcacc03d6929c80894b6bc99aa2756ea3f2ad7bcd7b" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.559092 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610076 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/388be198-b438-4142-8fb8-ec9831e9a1af-operator-scripts\") pod \"388be198-b438-4142-8fb8-ec9831e9a1af\" (UID: \"388be198-b438-4142-8fb8-ec9831e9a1af\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610422 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cr8q\" (UniqueName: \"kubernetes.io/projected/24adb929-f812-4243-94ea-23345856d28f-kube-api-access-6cr8q\") pod \"24adb929-f812-4243-94ea-23345856d28f\" (UID: \"24adb929-f812-4243-94ea-23345856d28f\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610458 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4dsb\" (UniqueName: \"kubernetes.io/projected/388be198-b438-4142-8fb8-ec9831e9a1af-kube-api-access-p4dsb\") pod \"388be198-b438-4142-8fb8-ec9831e9a1af\" (UID: \"388be198-b438-4142-8fb8-ec9831e9a1af\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610496 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24adb929-f812-4243-94ea-23345856d28f-operator-scripts\") pod \"24adb929-f812-4243-94ea-23345856d28f\" (UID: \"24adb929-f812-4243-94ea-23345856d28f\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610776 4781 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610794 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfkxw\" (UniqueName: 
\"kubernetes.io/projected/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-kube-api-access-dfkxw\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610804 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6wjb\" (UniqueName: \"kubernetes.io/projected/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-kube-api-access-j6wjb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610813 4781 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610821 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610829 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0eb55288-e9bb-46f0-bae3-789e8db036cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610837 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610844 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cafd294d-e929-4cd5-8be3-7175ad4aed09-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610852 4781 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 27 
00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610861 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfxln\" (UniqueName: \"kubernetes.io/projected/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-kube-api-access-jfxln\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610871 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjdvb\" (UniqueName: \"kubernetes.io/projected/0eb55288-e9bb-46f0-bae3-789e8db036cf-kube-api-access-wjdvb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610878 4781 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610887 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610896 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zhb8\" (UniqueName: \"kubernetes.io/projected/cafd294d-e929-4cd5-8be3-7175ad4aed09-kube-api-access-5zhb8\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.611260 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24adb929-f812-4243-94ea-23345856d28f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24adb929-f812-4243-94ea-23345856d28f" (UID: "24adb929-f812-4243-94ea-23345856d28f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.612891 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" event={"ID":"6344c1fe-eecb-4d57-a5c7-a857e4466439","Type":"ContainerDied","Data":"3156fc4e82e446fff06ff8e16e3fad71473d705c6d619f1c07498df33a7e7f1a"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.612930 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3156fc4e82e446fff06ff8e16e3fad71473d705c6d619f1c07498df33a7e7f1a" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.614392 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24adb929-f812-4243-94ea-23345856d28f-kube-api-access-6cr8q" (OuterVolumeSpecName: "kube-api-access-6cr8q") pod "24adb929-f812-4243-94ea-23345856d28f" (UID: "24adb929-f812-4243-94ea-23345856d28f"). InnerVolumeSpecName "kube-api-access-6cr8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.614963 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/388be198-b438-4142-8fb8-ec9831e9a1af-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "388be198-b438-4142-8fb8-ec9831e9a1af" (UID: "388be198-b438-4142-8fb8-ec9831e9a1af"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.620897 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388be198-b438-4142-8fb8-ec9831e9a1af-kube-api-access-p4dsb" (OuterVolumeSpecName: "kube-api-access-p4dsb") pod "388be198-b438-4142-8fb8-ec9831e9a1af" (UID: "388be198-b438-4142-8fb8-ec9831e9a1af"). InnerVolumeSpecName "kube-api-access-p4dsb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.627068 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zvn4t" event={"ID":"e8806487-486f-464d-8249-b6368daabff5","Type":"ContainerDied","Data":"acca40850a65ffa395f9bc3d270f6d4791d803b54b4d4bdd03263ee754d9ce94"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.627114 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acca40850a65ffa395f9bc3d270f6d4791d803b54b4d4bdd03263ee754d9ce94" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.648879 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-99xdp" event={"ID":"24adb929-f812-4243-94ea-23345856d28f","Type":"ContainerDied","Data":"afb6fce1583a2499e12761dfb2a4e40745be9e1f5459d6226e780d1b7f1a01e4"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.648917 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afb6fce1583a2499e12761dfb2a4e40745be9e1f5459d6226e780d1b7f1a01e4" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.648977 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.684831 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.687219 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.696511 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.713130 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/388be198-b438-4142-8fb8-ec9831e9a1af-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.713240 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cr8q\" (UniqueName: \"kubernetes.io/projected/24adb929-f812-4243-94ea-23345856d28f-kube-api-access-6cr8q\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.713267 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4dsb\" (UniqueName: \"kubernetes.io/projected/388be198-b438-4142-8fb8-ec9831e9a1af-kube-api-access-p4dsb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.713284 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24adb929-f812-4243-94ea-23345856d28f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.815565 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh4qm\" (UniqueName: \"kubernetes.io/projected/e8806487-486f-464d-8249-b6368daabff5-kube-api-access-qh4qm\") pod \"e8806487-486f-464d-8249-b6368daabff5\" (UID: \"e8806487-486f-464d-8249-b6368daabff5\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.815650 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmwsr\" (UniqueName: \"kubernetes.io/projected/6344c1fe-eecb-4d57-a5c7-a857e4466439-kube-api-access-rmwsr\") pod \"6344c1fe-eecb-4d57-a5c7-a857e4466439\" (UID: \"6344c1fe-eecb-4d57-a5c7-a857e4466439\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.815819 4781 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8806487-486f-464d-8249-b6368daabff5-operator-scripts\") pod \"e8806487-486f-464d-8249-b6368daabff5\" (UID: \"e8806487-486f-464d-8249-b6368daabff5\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.815864 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6344c1fe-eecb-4d57-a5c7-a857e4466439-operator-scripts\") pod \"6344c1fe-eecb-4d57-a5c7-a857e4466439\" (UID: \"6344c1fe-eecb-4d57-a5c7-a857e4466439\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.815900 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-operator-scripts\") pod \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\" (UID: \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.815977 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdgbd\" (UniqueName: \"kubernetes.io/projected/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-kube-api-access-qdgbd\") pod \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\" (UID: \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.816677 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8806487-486f-464d-8249-b6368daabff5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e8806487-486f-464d-8249-b6368daabff5" (UID: "e8806487-486f-464d-8249-b6368daabff5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.819946 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3aedfe4-2bbb-46c9-97d4-8d6782c44707" (UID: "e3aedfe4-2bbb-46c9-97d4-8d6782c44707"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.824544 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6344c1fe-eecb-4d57-a5c7-a857e4466439-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6344c1fe-eecb-4d57-a5c7-a857e4466439" (UID: "6344c1fe-eecb-4d57-a5c7-a857e4466439"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.824697 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6344c1fe-eecb-4d57-a5c7-a857e4466439-kube-api-access-rmwsr" (OuterVolumeSpecName: "kube-api-access-rmwsr") pod "6344c1fe-eecb-4d57-a5c7-a857e4466439" (UID: "6344c1fe-eecb-4d57-a5c7-a857e4466439"). InnerVolumeSpecName "kube-api-access-rmwsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.835894 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8806487-486f-464d-8249-b6368daabff5-kube-api-access-qh4qm" (OuterVolumeSpecName: "kube-api-access-qh4qm") pod "e8806487-486f-464d-8249-b6368daabff5" (UID: "e8806487-486f-464d-8249-b6368daabff5"). InnerVolumeSpecName "kube-api-access-qh4qm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.848472 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-kube-api-access-qdgbd" (OuterVolumeSpecName: "kube-api-access-qdgbd") pod "e3aedfe4-2bbb-46c9-97d4-8d6782c44707" (UID: "e3aedfe4-2bbb-46c9-97d4-8d6782c44707"). InnerVolumeSpecName "kube-api-access-qdgbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.917591 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8806487-486f-464d-8249-b6368daabff5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.917645 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6344c1fe-eecb-4d57-a5c7-a857e4466439-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.917659 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.917669 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdgbd\" (UniqueName: \"kubernetes.io/projected/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-kube-api-access-qdgbd\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.917680 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh4qm\" (UniqueName: \"kubernetes.io/projected/e8806487-486f-464d-8249-b6368daabff5-kube-api-access-qh4qm\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.917690 4781 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-rmwsr\" (UniqueName: \"kubernetes.io/projected/6344c1fe-eecb-4d57-a5c7-a857e4466439-kube-api-access-rmwsr\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.672811 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rs9bx" event={"ID":"58b577a3-c234-4968-a8e7-c5e629de47b1","Type":"ContainerStarted","Data":"69da9fba4081d0816d2a2271ca344a6097bd067857fe6ffab787c65da0531cbc"} Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.680677 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.681476 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.685461 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"f123a34509614ea32220d16f3dcaae5c63f248b87dbd7f293f38c1d213478e87"} Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.685603 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.685561 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.685506 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.702608 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-rs9bx" podStartSLOduration=6.885414322 podStartE2EDuration="12.702591104s" podCreationTimestamp="2026-02-27 00:26:20 +0000 UTC" firstStartedPulling="2026-02-27 00:26:25.32014684 +0000 UTC m=+1254.577686394" lastFinishedPulling="2026-02-27 00:26:31.137323622 +0000 UTC m=+1260.394863176" observedRunningTime="2026-02-27 00:26:32.690910385 +0000 UTC m=+1261.948449939" watchObservedRunningTime="2026-02-27 00:26:32.702591104 +0000 UTC m=+1261.960130658" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.773060 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9zkpb-config-9k74f"] Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.794372 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9zkpb-config-9k74f"] Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.825116 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9zkpb-config-rxxl5"] Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825466 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6344c1fe-eecb-4d57-a5c7-a857e4466439" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.825483 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6344c1fe-eecb-4d57-a5c7-a857e4466439" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825499 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb55288-e9bb-46f0-bae3-789e8db036cf" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.825506 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0eb55288-e9bb-46f0-bae3-789e8db036cf" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825517 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b9cc074-4ea1-4c04-9398-5be68fbcd5cf" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.825524 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b9cc074-4ea1-4c04-9398-5be68fbcd5cf" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825534 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="119c35de-5e7a-4d3f-af8a-3595d7dc69aa" containerName="ovn-config" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.825539 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="119c35de-5e7a-4d3f-af8a-3595d7dc69aa" containerName="ovn-config" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825549 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae26ad0-3770-4153-a1d6-96ae3a9e36a9" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.825555 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae26ad0-3770-4153-a1d6-96ae3a9e36a9" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825569 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3aedfe4-2bbb-46c9-97d4-8d6782c44707" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.825575 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3aedfe4-2bbb-46c9-97d4-8d6782c44707" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825587 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cafd294d-e929-4cd5-8be3-7175ad4aed09" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 
00:26:32.825593 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cafd294d-e929-4cd5-8be3-7175ad4aed09" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825603 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8806487-486f-464d-8249-b6368daabff5" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.825609 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8806487-486f-464d-8249-b6368daabff5" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825620 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388be198-b438-4142-8fb8-ec9831e9a1af" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827669 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="388be198-b438-4142-8fb8-ec9831e9a1af" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.827683 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24adb929-f812-4243-94ea-23345856d28f" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827691 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="24adb929-f812-4243-94ea-23345856d28f" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827898 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6344c1fe-eecb-4d57-a5c7-a857e4466439" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827914 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="cafd294d-e929-4cd5-8be3-7175ad4aed09" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827935 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5b9cc074-4ea1-4c04-9398-5be68fbcd5cf" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827945 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="24adb929-f812-4243-94ea-23345856d28f" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827956 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8806487-486f-464d-8249-b6368daabff5" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827967 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3aedfe4-2bbb-46c9-97d4-8d6782c44707" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827976 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae26ad0-3770-4153-a1d6-96ae3a9e36a9" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827992 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eb55288-e9bb-46f0-bae3-789e8db036cf" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.828001 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="388be198-b438-4142-8fb8-ec9831e9a1af" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.828011 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="119c35de-5e7a-4d3f-af8a-3595d7dc69aa" containerName="ovn-config" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.828596 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.831759 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.843086 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42267\" (UniqueName: \"kubernetes.io/projected/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-kube-api-access-42267\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.843144 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-scripts\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.843216 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-log-ovn\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.843232 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-additional-scripts\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.843256 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.843307 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run-ovn\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.871758 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9zkpb-config-rxxl5"] Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.944596 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run-ovn\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.944893 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run-ovn\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.948316 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42267\" (UniqueName: \"kubernetes.io/projected/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-kube-api-access-42267\") pod 
\"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.948429 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-scripts\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.948660 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-additional-scripts\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.948741 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-log-ovn\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.948803 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.949079 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: 
\"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.949188 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-log-ovn\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.949721 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-additional-scripts\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.952742 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-scripts\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.964322 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42267\" (UniqueName: \"kubernetes.io/projected/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-kube-api-access-42267\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:33 crc kubenswrapper[4781]: I0227 00:26:33.024766 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:33 crc kubenswrapper[4781]: I0227 00:26:33.328522 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="119c35de-5e7a-4d3f-af8a-3595d7dc69aa" path="/var/lib/kubelet/pods/119c35de-5e7a-4d3f-af8a-3595d7dc69aa/volumes" Feb 27 00:26:33 crc kubenswrapper[4781]: I0227 00:26:33.586700 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9zkpb-config-rxxl5"] Feb 27 00:26:33 crc kubenswrapper[4781]: W0227 00:26:33.593798 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e5d8eab_34ec_499a_9d69_068f5fb7d9ad.slice/crio-aa283a0b96a963d6521b640c88b47d54a5b742d6d8c749f00a556b9bd66c57bb WatchSource:0}: Error finding container aa283a0b96a963d6521b640c88b47d54a5b742d6d8c749f00a556b9bd66c57bb: Status 404 returned error can't find the container with id aa283a0b96a963d6521b640c88b47d54a5b742d6d8c749f00a556b9bd66c57bb Feb 27 00:26:33 crc kubenswrapper[4781]: I0227 00:26:33.691920 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"67c3b0c45a47fddfb6a0951ff71913409afedbf8d25ee2bb5323d4bbb8b32af1"} Feb 27 00:26:33 crc kubenswrapper[4781]: I0227 00:26:33.691964 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"d12ec22021b20d4ca12664a4251617c4b740ddc1110cc6c583285cc7c7efa3da"} Feb 27 00:26:33 crc kubenswrapper[4781]: I0227 00:26:33.696367 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb-config-rxxl5" event={"ID":"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad","Type":"ContainerStarted","Data":"aa283a0b96a963d6521b640c88b47d54a5b742d6d8c749f00a556b9bd66c57bb"} Feb 27 00:26:34 crc kubenswrapper[4781]: 
I0227 00:26:34.713918 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"df80b1a2f30812e281285d79ab8c9e2883ed993d5dfaf62c28bd45bbabd723ce"} Feb 27 00:26:34 crc kubenswrapper[4781]: I0227 00:26:34.714537 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"325759443b43a962464bf46dcc5ca5259566a08633a98d0057d55db65af383d2"} Feb 27 00:26:34 crc kubenswrapper[4781]: I0227 00:26:34.718105 4781 generic.go:334] "Generic (PLEG): container finished" podID="47cc3f01-6a5c-4797-bf86-25770e66e928" containerID="e75379ab5c604b926c8da8b4e1bc70d938265b4b81cac412dc92c66988d11e4a" exitCode=0 Feb 27 00:26:34 crc kubenswrapper[4781]: I0227 00:26:34.718205 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8tmft" event={"ID":"47cc3f01-6a5c-4797-bf86-25770e66e928","Type":"ContainerDied","Data":"e75379ab5c604b926c8da8b4e1bc70d938265b4b81cac412dc92c66988d11e4a"} Feb 27 00:26:34 crc kubenswrapper[4781]: I0227 00:26:34.721274 4781 generic.go:334] "Generic (PLEG): container finished" podID="4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" containerID="da1dbeb22d52f0e9e8028b046b421ef782d44fa0719cff0b4421d346eb2fd5aa" exitCode=0 Feb 27 00:26:34 crc kubenswrapper[4781]: I0227 00:26:34.721315 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb-config-rxxl5" event={"ID":"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad","Type":"ContainerDied","Data":"da1dbeb22d52f0e9e8028b046b421ef782d44fa0719cff0b4421d346eb2fd5aa"} Feb 27 00:26:35 crc kubenswrapper[4781]: I0227 00:26:35.735597 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"de3fe0551c7c98ccf5400a775dd2816f8af862a74d043466dc97b6d801196637"} 
Feb 27 00:26:35 crc kubenswrapper[4781]: I0227 00:26:35.735894 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"5298b3fdc2f8074231037b5c09ecc03350f0efc960213efc6b6884e137f198cf"} Feb 27 00:26:35 crc kubenswrapper[4781]: I0227 00:26:35.742283 4781 generic.go:334] "Generic (PLEG): container finished" podID="58b577a3-c234-4968-a8e7-c5e629de47b1" containerID="69da9fba4081d0816d2a2271ca344a6097bd067857fe6ffab787c65da0531cbc" exitCode=0 Feb 27 00:26:35 crc kubenswrapper[4781]: I0227 00:26:35.742556 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rs9bx" event={"ID":"58b577a3-c234-4968-a8e7-c5e629de47b1","Type":"ContainerDied","Data":"69da9fba4081d0816d2a2271ca344a6097bd067857fe6ffab787c65da0531cbc"} Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.055166 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.095867 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.099058 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.241318 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run-ovn\") pod \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.241392 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42267\" (UniqueName: 
\"kubernetes.io/projected/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-kube-api-access-42267\") pod \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.241425 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-scripts\") pod \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.241503 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-additional-scripts\") pod \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.241557 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run\") pod \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.241614 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-log-ovn\") pod \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.242408 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" (UID: "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.242922 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" (UID: "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.243111 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run" (OuterVolumeSpecName: "var-run") pod "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" (UID: "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.243152 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" (UID: "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.243975 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-scripts" (OuterVolumeSpecName: "scripts") pod "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" (UID: "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.247814 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-kube-api-access-42267" (OuterVolumeSpecName: "kube-api-access-42267") pod "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" (UID: "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad"). InnerVolumeSpecName "kube-api-access-42267". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.343872 4781 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.344167 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42267\" (UniqueName: \"kubernetes.io/projected/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-kube-api-access-42267\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.344185 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.344197 4781 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.344234 4781 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.344244 4781 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.356323 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.548328 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsrkc\" (UniqueName: \"kubernetes.io/projected/47cc3f01-6a5c-4797-bf86-25770e66e928-kube-api-access-gsrkc\") pod \"47cc3f01-6a5c-4797-bf86-25770e66e928\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.548468 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-config-data\") pod \"47cc3f01-6a5c-4797-bf86-25770e66e928\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.548645 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-db-sync-config-data\") pod \"47cc3f01-6a5c-4797-bf86-25770e66e928\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.548672 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-combined-ca-bundle\") pod \"47cc3f01-6a5c-4797-bf86-25770e66e928\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.558825 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47cc3f01-6a5c-4797-bf86-25770e66e928-kube-api-access-gsrkc" (OuterVolumeSpecName: "kube-api-access-gsrkc") 
pod "47cc3f01-6a5c-4797-bf86-25770e66e928" (UID: "47cc3f01-6a5c-4797-bf86-25770e66e928"). InnerVolumeSpecName "kube-api-access-gsrkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.559460 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "47cc3f01-6a5c-4797-bf86-25770e66e928" (UID: "47cc3f01-6a5c-4797-bf86-25770e66e928"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.579292 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47cc3f01-6a5c-4797-bf86-25770e66e928" (UID: "47cc3f01-6a5c-4797-bf86-25770e66e928"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.609115 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-config-data" (OuterVolumeSpecName: "config-data") pod "47cc3f01-6a5c-4797-bf86-25770e66e928" (UID: "47cc3f01-6a5c-4797-bf86-25770e66e928"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.650859 4781 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.650976 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.651063 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsrkc\" (UniqueName: \"kubernetes.io/projected/47cc3f01-6a5c-4797-bf86-25770e66e928-kube-api-access-gsrkc\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.651146 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.757660 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb-config-rxxl5" event={"ID":"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad","Type":"ContainerDied","Data":"aa283a0b96a963d6521b640c88b47d54a5b742d6d8c749f00a556b9bd66c57bb"} Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.757689 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.757723 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa283a0b96a963d6521b640c88b47d54a5b742d6d8c749f00a556b9bd66c57bb" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.764495 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"8f47ac743ea10fdb7ecd5ee7aee1a8be95e4cb690f71f8b316e36580af9decfd"} Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.764532 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"41227c8db766ee5b37d3103966e5e64ae2d6cc15dd0bd2daecd0a0014c90a62d"} Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.764546 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"6b44f38c9eaff116f1e2aeb9c56b1882e9c280eb2ad1d24d93d2a2bcf46057d2"} Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.764559 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"281ce6127d50e533b6ef2c64eeb59c860e5d40b0cbb82b607c6f5880f0452db7"} Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.767694 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.771453 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8tmft" event={"ID":"47cc3f01-6a5c-4797-bf86-25770e66e928","Type":"ContainerDied","Data":"c43bdd484887a1ab19b1a74ff7e94493b840e9d2b41b9b9e8c3466f0b78cc88d"} Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.771770 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c43bdd484887a1ab19b1a74ff7e94493b840e9d2b41b9b9e8c3466f0b78cc88d" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.772733 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.041380 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.260339 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9zkpb-config-rxxl5"] Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.284482 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9zkpb-config-rxxl5"] Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.331124 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" path="/var/lib/kubelet/pods/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad/volumes" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.368330 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-jzk92"] Feb 27 00:26:37 crc kubenswrapper[4781]: E0227 00:26:37.369063 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" containerName="ovn-config" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.369077 4781 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" containerName="ovn-config" Feb 27 00:26:37 crc kubenswrapper[4781]: E0227 00:26:37.369117 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47cc3f01-6a5c-4797-bf86-25770e66e928" containerName="glance-db-sync" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.369124 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="47cc3f01-6a5c-4797-bf86-25770e66e928" containerName="glance-db-sync" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.369504 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" containerName="ovn-config" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.369528 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="47cc3f01-6a5c-4797-bf86-25770e66e928" containerName="glance-db-sync" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.373416 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.400010 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-jzk92"] Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.411663 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.488992 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-combined-ca-bundle\") pod \"58b577a3-c234-4968-a8e7-c5e629de47b1\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.489430 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29n8z\" (UniqueName: \"kubernetes.io/projected/58b577a3-c234-4968-a8e7-c5e629de47b1-kube-api-access-29n8z\") pod \"58b577a3-c234-4968-a8e7-c5e629de47b1\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.489568 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-config-data\") pod \"58b577a3-c234-4968-a8e7-c5e629de47b1\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.490039 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-config\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.490110 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.490139 
4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.490244 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n688\" (UniqueName: \"kubernetes.io/projected/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-kube-api-access-9n688\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.490320 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.499118 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58b577a3-c234-4968-a8e7-c5e629de47b1-kube-api-access-29n8z" (OuterVolumeSpecName: "kube-api-access-29n8z") pod "58b577a3-c234-4968-a8e7-c5e629de47b1" (UID: "58b577a3-c234-4968-a8e7-c5e629de47b1"). InnerVolumeSpecName "kube-api-access-29n8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.529502 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58b577a3-c234-4968-a8e7-c5e629de47b1" (UID: "58b577a3-c234-4968-a8e7-c5e629de47b1"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.569544 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-config-data" (OuterVolumeSpecName: "config-data") pod "58b577a3-c234-4968-a8e7-c5e629de47b1" (UID: "58b577a3-c234-4968-a8e7-c5e629de47b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.592064 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.592109 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.592192 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n688\" (UniqueName: \"kubernetes.io/projected/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-kube-api-access-9n688\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.592246 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: 
\"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.592274 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-config\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.592526 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.592538 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29n8z\" (UniqueName: \"kubernetes.io/projected/58b577a3-c234-4968-a8e7-c5e629de47b1-kube-api-access-29n8z\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.592549 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.593261 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-config\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.593800 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " 
pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.595858 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.596048 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.619818 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n688\" (UniqueName: \"kubernetes.io/projected/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-kube-api-access-9n688\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.718108 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.801017 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rs9bx" event={"ID":"58b577a3-c234-4968-a8e7-c5e629de47b1","Type":"ContainerDied","Data":"9adcda9f33abe1db2735efdad7642538c67bab67f10201630c502ea9fc7b9c52"} Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.801046 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.801062 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9adcda9f33abe1db2735efdad7642538c67bab67f10201630c502ea9fc7b9c52" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.812291 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"7da95ec092aa8d03f82818fa419cc458bcf6c5915f99840320539283be886091"} Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.862745 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=43.22446746 podStartE2EDuration="52.862726298s" podCreationTimestamp="2026-02-27 00:25:45 +0000 UTC" firstStartedPulling="2026-02-27 00:26:25.48590174 +0000 UTC m=+1254.743441304" lastFinishedPulling="2026-02-27 00:26:35.124160588 +0000 UTC m=+1264.381700142" observedRunningTime="2026-02-27 00:26:37.847752442 +0000 UTC m=+1267.105292016" watchObservedRunningTime="2026-02-27 00:26:37.862726298 +0000 UTC m=+1267.120265852" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.042900 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-jzk92"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.069595 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fwppv"] Feb 27 00:26:38 crc kubenswrapper[4781]: E0227 00:26:38.069989 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58b577a3-c234-4968-a8e7-c5e629de47b1" containerName="keystone-db-sync" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.070007 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="58b577a3-c234-4968-a8e7-c5e629de47b1" containerName="keystone-db-sync" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.070220 4781 
memory_manager.go:354] "RemoveStaleState removing state" podUID="58b577a3-c234-4968-a8e7-c5e629de47b1" containerName="keystone-db-sync" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.070876 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.078488 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nhgp" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.078690 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.079061 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.079191 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.079762 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.098682 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fwppv"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.104742 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-scripts\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.104853 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-config-data\") pod \"keystone-bootstrap-fwppv\" (UID: 
\"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.104909 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdcl6\" (UniqueName: \"kubernetes.io/projected/75b432e5-2a1d-421d-ac63-202bbe4be5c5-kube-api-access-pdcl6\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.104963 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-credential-keys\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.104991 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-combined-ca-bundle\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.105042 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-fernet-keys\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.114681 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-784f69c749-m2gmj"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.116642 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.145249 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-m2gmj"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.210020 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-fernet-keys\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.210105 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-scripts\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.210147 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrnxf\" (UniqueName: \"kubernetes.io/projected/8be86e27-4a35-4929-92d1-bfcd0ce641a8-kube-api-access-nrnxf\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.210183 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-config\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.210222 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-config-data\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.211346 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.211433 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.211514 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdcl6\" (UniqueName: \"kubernetes.io/projected/75b432e5-2a1d-421d-ac63-202bbe4be5c5-kube-api-access-pdcl6\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.211650 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-credential-keys\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.211740 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-combined-ca-bundle\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.211822 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-dns-svc\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.229441 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-credential-keys\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.231141 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-combined-ca-bundle\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.246146 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-config-data\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.247065 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-scripts\") pod \"keystone-bootstrap-fwppv\" (UID: 
\"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.249978 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-fernet-keys\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.260954 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdcl6\" (UniqueName: \"kubernetes.io/projected/75b432e5-2a1d-421d-ac63-202bbe4be5c5-kube-api-access-pdcl6\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.314655 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-dns-svc\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.314763 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrnxf\" (UniqueName: \"kubernetes.io/projected/8be86e27-4a35-4929-92d1-bfcd0ce641a8-kube-api-access-nrnxf\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.314801 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-config\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 
00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.314848 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.314868 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.315747 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.316272 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-dns-svc\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.317026 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-config\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.317534 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.361187 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrnxf\" (UniqueName: \"kubernetes.io/projected/8be86e27-4a35-4929-92d1-bfcd0ce641a8-kube-api-access-nrnxf\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.389409 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.433938 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-jzk92"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.462767 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-9vlp4"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.483479 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.492505 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-m2gmj"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.492922 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.497530 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5hsdr" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.497751 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.497863 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.517930 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-combined-ca-bundle\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.517991 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-scripts\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.518019 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvkqc\" (UniqueName: \"kubernetes.io/projected/aef65495-ecb2-4396-bb05-a4c5ee48f291-kube-api-access-tvkqc\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.518067 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-db-sync-config-data\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.518089 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aef65495-ecb2-4396-bb05-a4c5ee48f291-etc-machine-id\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.518139 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-config-data\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.522506 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9vlp4"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.595647 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-82c69"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.597232 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.602890 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622541 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-config\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622603 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-db-sync-config-data\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622638 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aef65495-ecb2-4396-bb05-a4c5ee48f291-etc-machine-id\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622657 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622715 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-config-data\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622756 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622776 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bst95\" (UniqueName: \"kubernetes.io/projected/9673a51c-390f-4e38-ae85-e5c3e1eaa816-kube-api-access-bst95\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622809 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-combined-ca-bundle\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622832 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-svc\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622857 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-scripts\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622875 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvkqc\" (UniqueName: \"kubernetes.io/projected/aef65495-ecb2-4396-bb05-a4c5ee48f291-kube-api-access-tvkqc\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622913 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.623213 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aef65495-ecb2-4396-bb05-a4c5ee48f291-etc-machine-id\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.632336 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-82c69"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.637852 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-bk54r"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.645310 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-combined-ca-bundle\") pod \"cinder-db-sync-9vlp4\" (UID: 
\"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.645686 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-db-sync-config-data\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.663409 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-bf4zw"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.664159 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bk54r"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.664236 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.664687 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.671516 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.671882 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.672099 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.672185 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-d4ppr" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.672264 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2j295" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.676146 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-config-data\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.678250 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvkqc\" (UniqueName: \"kubernetes.io/projected/aef65495-ecb2-4396-bb05-a4c5ee48f291-kube-api-access-tvkqc\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.678912 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bf4zw"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.683833 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-scripts\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.730238 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.730348 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bst95\" (UniqueName: \"kubernetes.io/projected/9673a51c-390f-4e38-ae85-e5c3e1eaa816-kube-api-access-bst95\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.730436 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-svc\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.730540 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.730585 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-config\") pod 
\"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.730651 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.731875 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-svc\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.732859 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.733500 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-config\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.766532 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " 
pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.766662 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.786655 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.794861 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.817988 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.818420 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.819293 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bst95\" (UniqueName: \"kubernetes.io/projected/9673a51c-390f-4e38-ae85-e5c3e1eaa816-kube-api-access-bst95\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.833485 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-combined-ca-bundle\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.833561 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzkc9\" (UniqueName: \"kubernetes.io/projected/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-kube-api-access-lzkc9\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.840236 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-config\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.842702 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-db-sync-config-data\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.851225 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-combined-ca-bundle\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.851322 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmp94\" (UniqueName: \"kubernetes.io/projected/314ca901-3264-4136-b377-daad0075b72c-kube-api-access-lmp94\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.853286 4781 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-82c69"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.855273 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.857125 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.860524 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" event={"ID":"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0","Type":"ContainerStarted","Data":"8ff5a2fc17a6b9e8aabae8319eed0080a626c32c0d363e4d215380fca5df2f7c"} Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.919075 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.946675 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5d6jk"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.948723 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957285 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-config-data\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957358 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-log-httpd\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957414 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-db-sync-config-data\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957474 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-run-httpd\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957535 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-combined-ca-bundle\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957566 
4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957647 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmp94\" (UniqueName: \"kubernetes.io/projected/314ca901-3264-4136-b377-daad0075b72c-kube-api-access-lmp94\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957734 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-combined-ca-bundle\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957807 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-scripts\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957841 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzkc9\" (UniqueName: \"kubernetes.io/projected/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-kube-api-access-lzkc9\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957926 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9qwfg\" (UniqueName: \"kubernetes.io/projected/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-kube-api-access-9qwfg\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957985 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.958027 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-config\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.968195 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-db-sync-config-data\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.971267 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-l9w6z"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.971685 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-combined-ca-bundle\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.987293 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-config\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.987481 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-combined-ca-bundle\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.006153 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzkc9\" (UniqueName: \"kubernetes.io/projected/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-kube-api-access-lzkc9\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.007329 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.010311 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-qt68h" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.011158 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.011502 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.011568 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.018272 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmp94\" (UniqueName: \"kubernetes.io/projected/314ca901-3264-4136-b377-daad0075b72c-kube-api-access-lmp94\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.018348 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-l9w6z"] Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.036366 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jqsnp"] Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.037670 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.042806 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.044838 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7kxfw" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.045072 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.045324 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.053190 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059642 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qwfg\" (UniqueName: \"kubernetes.io/projected/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-kube-api-access-9qwfg\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059692 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059744 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059776 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-config-data\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059799 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-log-httpd\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059833 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxl4f\" (UniqueName: \"kubernetes.io/projected/555d083f-48ec-4cf2-922f-211c99af51be-kube-api-access-hxl4f\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059863 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-run-httpd\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059882 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-config\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059906 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059925 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059948 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.060034 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.060058 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-scripts\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.068815 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5d6jk"]
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.069311 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-run-httpd\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.069553 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-log-httpd\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.095282 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qwfg\" (UniqueName: \"kubernetes.io/projected/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-kube-api-access-9qwfg\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.100930 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-scripts\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.105162 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-config-data\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.107399 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jqsnp"]
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.107967 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.115289 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.161939 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3fa4251-dd48-417b-8002-6df02d3d3dac-logs\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.162272 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.162375 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-config-data\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.162479 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxl4f\" (UniqueName: \"kubernetes.io/projected/555d083f-48ec-4cf2-922f-211c99af51be-kube-api-access-hxl4f\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.162562 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-config-data\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.162669 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-config\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.162770 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.162863 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-combined-ca-bundle\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.162936 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.163010 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-certs\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.163093 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwdbp\" (UniqueName: \"kubernetes.io/projected/a3fa4251-dd48-417b-8002-6df02d3d3dac-kube-api-access-pwdbp\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.165109 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-scripts\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.165204 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-scripts\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.165303 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-combined-ca-bundle\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.165393 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwsv7\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-kube-api-access-bwsv7\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.165460 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.170266 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.179715 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.184354 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-config\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.186274 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.186977 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.187609 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.199209 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxl4f\" (UniqueName: \"kubernetes.io/projected/555d083f-48ec-4cf2-922f-211c99af51be-kube-api-access-hxl4f\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.224434 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.243477 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.245305 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.249293 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.257115 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.257297 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4ql2s"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.257412 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271099 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3fa4251-dd48-417b-8002-6df02d3d3dac-logs\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271175 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-config-data\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271235 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-config-data\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271293 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-combined-ca-bundle\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271323 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-certs\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271376 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwdbp\" (UniqueName: \"kubernetes.io/projected/a3fa4251-dd48-417b-8002-6df02d3d3dac-kube-api-access-pwdbp\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271423 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-scripts\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271448 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-scripts\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271481 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-combined-ca-bundle\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271512 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwsv7\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-kube-api-access-bwsv7\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.272504 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3fa4251-dd48-417b-8002-6df02d3d3dac-logs\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.285460 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-config-data\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.286359 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-config-data\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.304280 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwsv7\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-kube-api-access-bwsv7\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.310497 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-combined-ca-bundle\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.319131 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-combined-ca-bundle\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.333936 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-certs\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.355039 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-scripts\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.357034 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-scripts\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.372158 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwdbp\" (UniqueName: \"kubernetes.io/projected/a3fa4251-dd48-417b-8002-6df02d3d3dac-kube-api-access-pwdbp\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.384199 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-config-data\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.384273 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-scripts\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.384298 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-logs\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.384337 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.384371 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.384404 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.384466 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxqvq\" (UniqueName: \"kubernetes.io/projected/1055fd61-f323-4cc6-8109-5096add1af65-kube-api-access-hxqvq\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.387167 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.389204 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.393850 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.405354 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.494048 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-scripts\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.494101 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-logs\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.494155 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.494188 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.494215 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.494260 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxqvq\" (UniqueName: \"kubernetes.io/projected/1055fd61-f323-4cc6-8109-5096add1af65-kube-api-access-hxqvq\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.494344 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-config-data\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.495168 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.495409 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-logs\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.500997 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.501035 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5d3045414bd1cd74ec61e0394ba262493610c57a87bbc940ef275e8fc1bc2ecf/globalmount\"" pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.501703 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-scripts\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.504358 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.507453 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-config-data\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.532356 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxqvq\" (UniqueName: \"kubernetes.io/projected/1055fd61-f323-4cc6-8109-5096add1af65-kube-api-access-hxqvq\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.595684 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.595741 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.595782 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-logs\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.595810 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.595874 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.595928 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.595950 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt497\" (UniqueName: \"kubernetes.io/projected/ac31f36e-35c4-4f48-a05b-f49855052358-kube-api-access-wt497\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.617414 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.698328 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.698799 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.698879 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-logs\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.699370 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.699454 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-logs\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.699464 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.699667 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.699697 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt497\" (UniqueName: \"kubernetes.io/projected/ac31f36e-35c4-4f48-a05b-f49855052358-kube-api-access-wt497\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.700321 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.703317 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.703351 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a7b96405e17327882846f95b5adf8b290f3f24e0a3e5cf6d272cf20133e6cae4/globalmount\"" pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.705000 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.705443 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.707602 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.721360 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt497\" (UniqueName: \"kubernetes.io/projected/ac31f36e-35c4-4f48-a05b-f49855052358-kube-api-access-wt497\") pod 
\"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.743160 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.783315 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.784280 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.847579 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.852461 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.865707 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-m2gmj"] Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.908853 4781 generic.go:334] "Generic (PLEG): container finished" podID="8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" containerID="f941532db51eb7d8f322beb83db6f8252752ff4cb56b4df1b09edb6a1f01a13c" exitCode=0 Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.908917 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" event={"ID":"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0","Type":"ContainerDied","Data":"f941532db51eb7d8f322beb83db6f8252752ff4cb56b4df1b09edb6a1f01a13c"} Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.925971 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fwppv"] Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.049419 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-82c69"] Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.485824 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5d6jk"] Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.494799 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9vlp4"] Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.542700 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bk54r"] Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.559958 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bf4zw"] Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.774283 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-l9w6z"] Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.781366 4781 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.790771 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.859282 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-config\") pod \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.859389 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n688\" (UniqueName: \"kubernetes.io/projected/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-kube-api-access-9n688\") pod \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.859479 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-nb\") pod \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.859522 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-dns-svc\") pod \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.859652 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-sb\") pod \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\" (UID: 
\"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.868018 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-kube-api-access-9n688" (OuterVolumeSpecName: "kube-api-access-9n688") pod "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" (UID: "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0"). InnerVolumeSpecName "kube-api-access-9n688". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.887048 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" (UID: "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.908796 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" (UID: "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.909430 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" (UID: "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.925464 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-config" (OuterVolumeSpecName: "config") pod "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" (UID: "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.930392 4781 generic.go:334] "Generic (PLEG): container finished" podID="8be86e27-4a35-4929-92d1-bfcd0ce641a8" containerID="0f7d06bdf37788cb17229ac4656b5472f2e83b9e00dc3b57906aa80af5c573a4" exitCode=0 Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.930453 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-m2gmj" event={"ID":"8be86e27-4a35-4929-92d1-bfcd0ce641a8","Type":"ContainerDied","Data":"0f7d06bdf37788cb17229ac4656b5472f2e83b9e00dc3b57906aa80af5c573a4"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.930479 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-m2gmj" event={"ID":"8be86e27-4a35-4929-92d1-bfcd0ce641a8","Type":"ContainerStarted","Data":"4ad7a1f1a6be4d60bb1c31fde9beac294118a7abd8ca340a5f209fda9ba451e6"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.933000 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d","Type":"ContainerStarted","Data":"8523e5974cb6fe577a148d4d77627c86ea1298c44ff6fdd8db602516c249b5d9"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.934178 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bf4zw" event={"ID":"314ca901-3264-4136-b377-daad0075b72c","Type":"ContainerStarted","Data":"0dfa44d37d2f64ae96d38dcaa27616ed0a623f908ad45d0066876fbf98be36ee"} Feb 27 00:26:40 crc 
kubenswrapper[4781]: I0227 00:26:40.936401 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bk54r" event={"ID":"3f43ab5c-f862-468c-92c1-ec7366eb7ed0","Type":"ContainerStarted","Data":"9abe8ef3a48995708f20de72923495db036e6761eb107a6dfc8ea5dccc96bf58"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.936446 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bk54r" event={"ID":"3f43ab5c-f862-468c-92c1-ec7366eb7ed0","Type":"ContainerStarted","Data":"f4907a514c133717f2dd463877fdc9d6b4b6535ee45f8865a1b93ba48242fe73"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.943366 4781 generic.go:334] "Generic (PLEG): container finished" podID="9673a51c-390f-4e38-ae85-e5c3e1eaa816" containerID="517ead0b52aa65ce0d6fa994a34b320f55d9362dc9c894d32144d2082b233fb4" exitCode=0 Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.943427 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-82c69" event={"ID":"9673a51c-390f-4e38-ae85-e5c3e1eaa816","Type":"ContainerDied","Data":"517ead0b52aa65ce0d6fa994a34b320f55d9362dc9c894d32144d2082b233fb4"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.943455 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-82c69" event={"ID":"9673a51c-390f-4e38-ae85-e5c3e1eaa816","Type":"ContainerStarted","Data":"be280fe4c5dcd4daf8e49a0f28102aac1f55d36e2411ff42f6fe40cde84b1918"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.948756 4781 generic.go:334] "Generic (PLEG): container finished" podID="555d083f-48ec-4cf2-922f-211c99af51be" containerID="bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15" exitCode=0 Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.948804 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" 
event={"ID":"555d083f-48ec-4cf2-922f-211c99af51be","Type":"ContainerDied","Data":"bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.948823 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" event={"ID":"555d083f-48ec-4cf2-922f-211c99af51be","Type":"ContainerStarted","Data":"e585d85b515ebdb2e3ddbc1e6c665f7d63c4b4ae71b72a576899c9774906e6b5"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.965367 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n688\" (UniqueName: \"kubernetes.io/projected/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-kube-api-access-9n688\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.965655 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.965668 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.965675 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.965685 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.971931 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-l9w6z" 
event={"ID":"2274af64-0743-4ede-8fb8-e2ed801638ac","Type":"ContainerStarted","Data":"6d62d5f9e32bc3adf9e5c830b2c7fb23773647380ed0a769526c60e85872b03f"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.982988 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" event={"ID":"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0","Type":"ContainerDied","Data":"8ff5a2fc17a6b9e8aabae8319eed0080a626c32c0d363e4d215380fca5df2f7c"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.983011 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.983064 4781 scope.go:117] "RemoveContainer" containerID="f941532db51eb7d8f322beb83db6f8252752ff4cb56b4df1b09edb6a1f01a13c" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.991378 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fwppv" event={"ID":"75b432e5-2a1d-421d-ac63-202bbe4be5c5","Type":"ContainerStarted","Data":"914d10b311f6e761cfe3376de0d9169e16d04822bd5c0495a9b64cbbe456b1f4"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.991429 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fwppv" event={"ID":"75b432e5-2a1d-421d-ac63-202bbe4be5c5","Type":"ContainerStarted","Data":"7610fc60d1158180f6a0fcb6c59bb1930e7a3d6fd2c319da87c248ff0413eb39"} Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.026926 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9vlp4" event={"ID":"aef65495-ecb2-4396-bb05-a4c5ee48f291","Type":"ContainerStarted","Data":"77049757ad8c9d1e53f2546542f34ddf95b52b836b4034f26af7417bb129d6d8"} Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.050460 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-bk54r" podStartSLOduration=3.050445003 
podStartE2EDuration="3.050445003s" podCreationTimestamp="2026-02-27 00:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:41.048174823 +0000 UTC m=+1270.305714377" watchObservedRunningTime="2026-02-27 00:26:41.050445003 +0000 UTC m=+1270.307984557" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.118348 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jqsnp"] Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.119748 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fwppv" podStartSLOduration=3.119732473 podStartE2EDuration="3.119732473s" podCreationTimestamp="2026-02-27 00:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:41.099298173 +0000 UTC m=+1270.356837737" watchObservedRunningTime="2026-02-27 00:26:41.119732473 +0000 UTC m=+1270.377272027" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.176543 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-jzk92"] Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.191109 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-jzk92"] Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.211830 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:26:41 crc kubenswrapper[4781]: W0227 00:26:41.236118 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1055fd61_f323_4cc6_8109_5096add1af65.slice/crio-0a4dbcea2445e2dfe290d51e82f770fb1c0297a9dd9fce44fbc8991c778b4137 WatchSource:0}: Error finding container 0a4dbcea2445e2dfe290d51e82f770fb1c0297a9dd9fce44fbc8991c778b4137: Status 
404 returned error can't find the container with id 0a4dbcea2445e2dfe290d51e82f770fb1c0297a9dd9fce44fbc8991c778b4137 Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.474758 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" path="/var/lib/kubelet/pods/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0/volumes" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.663313 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.692809 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-config\") pod \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.693234 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrnxf\" (UniqueName: \"kubernetes.io/projected/8be86e27-4a35-4929-92d1-bfcd0ce641a8-kube-api-access-nrnxf\") pod \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.703819 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be86e27-4a35-4929-92d1-bfcd0ce641a8-kube-api-access-nrnxf" (OuterVolumeSpecName: "kube-api-access-nrnxf") pod "8be86e27-4a35-4929-92d1-bfcd0ce641a8" (UID: "8be86e27-4a35-4929-92d1-bfcd0ce641a8"). InnerVolumeSpecName "kube-api-access-nrnxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.743470 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-config" (OuterVolumeSpecName: "config") pod "8be86e27-4a35-4929-92d1-bfcd0ce641a8" (UID: "8be86e27-4a35-4929-92d1-bfcd0ce641a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.764042 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800287 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-nb\") pod \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800391 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-swift-storage-0\") pod \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800425 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-svc\") pod \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800487 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bst95\" (UniqueName: \"kubernetes.io/projected/9673a51c-390f-4e38-ae85-e5c3e1eaa816-kube-api-access-bst95\") pod 
\"9673a51c-390f-4e38-ae85-e5c3e1eaa816\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800534 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-sb\") pod \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800554 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-sb\") pod \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800575 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-config\") pod \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800711 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-dns-svc\") pod \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800761 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-nb\") pod \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.801310 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.801339 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrnxf\" (UniqueName: \"kubernetes.io/projected/8be86e27-4a35-4929-92d1-bfcd0ce641a8-kube-api-access-nrnxf\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.830415 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9673a51c-390f-4e38-ae85-e5c3e1eaa816-kube-api-access-bst95" (OuterVolumeSpecName: "kube-api-access-bst95") pod "9673a51c-390f-4e38-ae85-e5c3e1eaa816" (UID: "9673a51c-390f-4e38-ae85-e5c3e1eaa816"). InnerVolumeSpecName "kube-api-access-bst95". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.910424 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bst95\" (UniqueName: \"kubernetes.io/projected/9673a51c-390f-4e38-ae85-e5c3e1eaa816-kube-api-access-bst95\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.058826 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1055fd61-f323-4cc6-8109-5096add1af65","Type":"ContainerStarted","Data":"0a4dbcea2445e2dfe290d51e82f770fb1c0297a9dd9fce44fbc8991c778b4137"} Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.084495 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8be86e27-4a35-4929-92d1-bfcd0ce641a8" (UID: "8be86e27-4a35-4929-92d1-bfcd0ce641a8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.089500 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9673a51c-390f-4e38-ae85-e5c3e1eaa816" (UID: "9673a51c-390f-4e38-ae85-e5c3e1eaa816"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.089748 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-82c69" event={"ID":"9673a51c-390f-4e38-ae85-e5c3e1eaa816","Type":"ContainerDied","Data":"be280fe4c5dcd4daf8e49a0f28102aac1f55d36e2411ff42f6fe40cde84b1918"} Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.089877 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.089900 4781 scope.go:117] "RemoveContainer" containerID="517ead0b52aa65ce0d6fa994a34b320f55d9362dc9c894d32144d2082b233fb4" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.103941 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jqsnp" event={"ID":"a3fa4251-dd48-417b-8002-6df02d3d3dac","Type":"ContainerStarted","Data":"6da65166fa2a15c764d849696d3e6b0686802ef8180c50248f4b03677850887a"} Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.112059 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.114190 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-m2gmj" event={"ID":"8be86e27-4a35-4929-92d1-bfcd0ce641a8","Type":"ContainerDied","Data":"4ad7a1f1a6be4d60bb1c31fde9beac294118a7abd8ca340a5f209fda9ba451e6"} Feb 27 00:26:42 crc 
kubenswrapper[4781]: I0227 00:26:42.114439 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.116135 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.116277 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.118299 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9673a51c-390f-4e38-ae85-e5c3e1eaa816" (UID: "9673a51c-390f-4e38-ae85-e5c3e1eaa816"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:42 crc kubenswrapper[4781]: W0227 00:26:42.120999 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac31f36e_35c4_4f48_a05b_f49855052358.slice/crio-084b441c862a06da9a9c0b4c64f7175017dfb66c7fcb6d7f1ae1871790335c15 WatchSource:0}: Error finding container 084b441c862a06da9a9c0b4c64f7175017dfb66c7fcb6d7f1ae1871790335c15: Status 404 returned error can't find the container with id 084b441c862a06da9a9c0b4c64f7175017dfb66c7fcb6d7f1ae1871790335c15 Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.138160 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8be86e27-4a35-4929-92d1-bfcd0ce641a8" (UID: "8be86e27-4a35-4929-92d1-bfcd0ce641a8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.141766 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-config" (OuterVolumeSpecName: "config") pod "9673a51c-390f-4e38-ae85-e5c3e1eaa816" (UID: "9673a51c-390f-4e38-ae85-e5c3e1eaa816"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.141876 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9673a51c-390f-4e38-ae85-e5c3e1eaa816" (UID: "9673a51c-390f-4e38-ae85-e5c3e1eaa816"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.144500 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8be86e27-4a35-4929-92d1-bfcd0ce641a8" (UID: "8be86e27-4a35-4929-92d1-bfcd0ce641a8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.147097 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9673a51c-390f-4e38-ae85-e5c3e1eaa816" (UID: "9673a51c-390f-4e38-ae85-e5c3e1eaa816"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.199353 4781 scope.go:117] "RemoveContainer" containerID="0f7d06bdf37788cb17229ac4656b5472f2e83b9e00dc3b57906aa80af5c573a4" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.229329 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.229361 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.229371 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.229380 4781 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.229388 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.229396 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.541036 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-82c69"] Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.558789 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-82c69"] Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.592443 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-m2gmj"] Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.614903 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-m2gmj"] Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.892757 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.895788 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.895833 4781 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.951541 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.977641 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:26:43 crc kubenswrapper[4781]: I0227 00:26:43.137694 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ac31f36e-35c4-4f48-a05b-f49855052358","Type":"ContainerStarted","Data":"084b441c862a06da9a9c0b4c64f7175017dfb66c7fcb6d7f1ae1871790335c15"} Feb 27 00:26:43 crc kubenswrapper[4781]: I0227 00:26:43.140326 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" event={"ID":"555d083f-48ec-4cf2-922f-211c99af51be","Type":"ContainerStarted","Data":"4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2"} Feb 27 00:26:43 crc kubenswrapper[4781]: I0227 00:26:43.141892 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:43 crc kubenswrapper[4781]: I0227 00:26:43.152093 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1055fd61-f323-4cc6-8109-5096add1af65","Type":"ContainerStarted","Data":"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318"} Feb 27 00:26:43 crc kubenswrapper[4781]: I0227 00:26:43.207205 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" podStartSLOduration=5.207184671 podStartE2EDuration="5.207184671s" podCreationTimestamp="2026-02-27 
00:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:43.180895086 +0000 UTC m=+1272.438434660" watchObservedRunningTime="2026-02-27 00:26:43.207184671 +0000 UTC m=+1272.464724235" Feb 27 00:26:43 crc kubenswrapper[4781]: I0227 00:26:43.343173 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8be86e27-4a35-4929-92d1-bfcd0ce641a8" path="/var/lib/kubelet/pods/8be86e27-4a35-4929-92d1-bfcd0ce641a8/volumes" Feb 27 00:26:43 crc kubenswrapper[4781]: I0227 00:26:43.343886 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9673a51c-390f-4e38-ae85-e5c3e1eaa816" path="/var/lib/kubelet/pods/9673a51c-390f-4e38-ae85-e5c3e1eaa816/volumes" Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.181120 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ac31f36e-35c4-4f48-a05b-f49855052358","Type":"ContainerStarted","Data":"af130dc6503472cd229a6073407a944eb9f345fe510e3a5882815f2cc79c8dc8"} Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.185502 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1055fd61-f323-4cc6-8109-5096add1af65","Type":"ContainerStarted","Data":"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e"} Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.185702 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1055fd61-f323-4cc6-8109-5096add1af65" containerName="glance-log" containerID="cri-o://a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318" gracePeriod=30 Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.185783 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="1055fd61-f323-4cc6-8109-5096add1af65" containerName="glance-httpd" containerID="cri-o://4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e" gracePeriod=30 Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.211048 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.211029321 podStartE2EDuration="6.211029321s" podCreationTimestamp="2026-02-27 00:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:44.206576573 +0000 UTC m=+1273.464116117" watchObservedRunningTime="2026-02-27 00:26:44.211029321 +0000 UTC m=+1273.468568875" Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.928988 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.994570 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxqvq\" (UniqueName: \"kubernetes.io/projected/1055fd61-f323-4cc6-8109-5096add1af65-kube-api-access-hxqvq\") pod \"1055fd61-f323-4cc6-8109-5096add1af65\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.994648 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-combined-ca-bundle\") pod \"1055fd61-f323-4cc6-8109-5096add1af65\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.994682 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-logs\") pod \"1055fd61-f323-4cc6-8109-5096add1af65\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " 
Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.994711 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-scripts\") pod \"1055fd61-f323-4cc6-8109-5096add1af65\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.994799 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-config-data\") pod \"1055fd61-f323-4cc6-8109-5096add1af65\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.994834 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-httpd-run\") pod \"1055fd61-f323-4cc6-8109-5096add1af65\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.994940 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"1055fd61-f323-4cc6-8109-5096add1af65\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.995668 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1055fd61-f323-4cc6-8109-5096add1af65" (UID: "1055fd61-f323-4cc6-8109-5096add1af65"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.995971 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-logs" (OuterVolumeSpecName: "logs") pod "1055fd61-f323-4cc6-8109-5096add1af65" (UID: "1055fd61-f323-4cc6-8109-5096add1af65"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.009300 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1055fd61-f323-4cc6-8109-5096add1af65-kube-api-access-hxqvq" (OuterVolumeSpecName: "kube-api-access-hxqvq") pod "1055fd61-f323-4cc6-8109-5096add1af65" (UID: "1055fd61-f323-4cc6-8109-5096add1af65"). InnerVolumeSpecName "kube-api-access-hxqvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.016846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-scripts" (OuterVolumeSpecName: "scripts") pod "1055fd61-f323-4cc6-8109-5096add1af65" (UID: "1055fd61-f323-4cc6-8109-5096add1af65"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.034478 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b" (OuterVolumeSpecName: "glance") pod "1055fd61-f323-4cc6-8109-5096add1af65" (UID: "1055fd61-f323-4cc6-8109-5096add1af65"). InnerVolumeSpecName "pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.059725 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1055fd61-f323-4cc6-8109-5096add1af65" (UID: "1055fd61-f323-4cc6-8109-5096add1af65"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.098932 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxqvq\" (UniqueName: \"kubernetes.io/projected/1055fd61-f323-4cc6-8109-5096add1af65-kube-api-access-hxqvq\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.098964 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.098973 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.098983 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.098993 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.099026 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") on node \"crc\" " Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.106923 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-config-data" (OuterVolumeSpecName: "config-data") pod "1055fd61-f323-4cc6-8109-5096add1af65" (UID: "1055fd61-f323-4cc6-8109-5096add1af65"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.127340 4781 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.127516 4781 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b") on node "crc" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.201710 4781 reconciler_common.go:293] "Volume detached for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.201771 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.206451 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ac31f36e-35c4-4f48-a05b-f49855052358","Type":"ContainerStarted","Data":"2a9c0dd16ab1b03b070571c17e79b75425373239b1c45b1c5b15a3b9a4d8f4b1"} Feb 27 00:26:45 crc 
kubenswrapper[4781]: I0227 00:26:45.206567 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" containerName="glance-log" containerID="cri-o://af130dc6503472cd229a6073407a944eb9f345fe510e3a5882815f2cc79c8dc8" gracePeriod=30 Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.206669 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" containerName="glance-httpd" containerID="cri-o://2a9c0dd16ab1b03b070571c17e79b75425373239b1c45b1c5b15a3b9a4d8f4b1" gracePeriod=30 Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.211311 4781 generic.go:334] "Generic (PLEG): container finished" podID="1055fd61-f323-4cc6-8109-5096add1af65" containerID="4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e" exitCode=143 Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.211343 4781 generic.go:334] "Generic (PLEG): container finished" podID="1055fd61-f323-4cc6-8109-5096add1af65" containerID="a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318" exitCode=143 Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.211386 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.211423 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1055fd61-f323-4cc6-8109-5096add1af65","Type":"ContainerDied","Data":"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e"} Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.211449 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1055fd61-f323-4cc6-8109-5096add1af65","Type":"ContainerDied","Data":"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318"} Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.211459 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1055fd61-f323-4cc6-8109-5096add1af65","Type":"ContainerDied","Data":"0a4dbcea2445e2dfe290d51e82f770fb1c0297a9dd9fce44fbc8991c778b4137"} Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.211473 4781 scope.go:117] "RemoveContainer" containerID="4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.242453 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.242427699 podStartE2EDuration="7.242427699s" podCreationTimestamp="2026-02-27 00:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:45.233030141 +0000 UTC m=+1274.490569695" watchObservedRunningTime="2026-02-27 00:26:45.242427699 +0000 UTC m=+1274.499967253" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.299558 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.306857 4781 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.348150 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1055fd61-f323-4cc6-8109-5096add1af65" path="/var/lib/kubelet/pods/1055fd61-f323-4cc6-8109-5096add1af65/volumes" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.355768 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:26:45 crc kubenswrapper[4781]: E0227 00:26:45.356317 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be86e27-4a35-4929-92d1-bfcd0ce641a8" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356343 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be86e27-4a35-4929-92d1-bfcd0ce641a8" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: E0227 00:26:45.356364 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9673a51c-390f-4e38-ae85-e5c3e1eaa816" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356375 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9673a51c-390f-4e38-ae85-e5c3e1eaa816" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: E0227 00:26:45.356406 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356414 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: E0227 00:26:45.356437 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1055fd61-f323-4cc6-8109-5096add1af65" containerName="glance-httpd" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356446 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1055fd61-f323-4cc6-8109-5096add1af65" 
containerName="glance-httpd" Feb 27 00:26:45 crc kubenswrapper[4781]: E0227 00:26:45.356458 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1055fd61-f323-4cc6-8109-5096add1af65" containerName="glance-log" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356466 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1055fd61-f323-4cc6-8109-5096add1af65" containerName="glance-log" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356707 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8be86e27-4a35-4929-92d1-bfcd0ce641a8" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356744 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1055fd61-f323-4cc6-8109-5096add1af65" containerName="glance-httpd" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356764 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356781 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9673a51c-390f-4e38-ae85-e5c3e1eaa816" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356794 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1055fd61-f323-4cc6-8109-5096add1af65" containerName="glance-log" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.358544 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.361993 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.380335 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.507288 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-config-data\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.507360 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vxcl\" (UniqueName: \"kubernetes.io/projected/fe8b7774-c640-416d-82a4-535fee88a47b-kube-api-access-8vxcl\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.507399 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.507474 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.507494 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-logs\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.507540 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.507557 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-scripts\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.609386 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.609431 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-logs\") pod \"glance-default-external-api-0\" (UID: 
\"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.609498 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.609531 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-scripts\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.609572 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-config-data\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.609723 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vxcl\" (UniqueName: \"kubernetes.io/projected/fe8b7774-c640-416d-82a4-535fee88a47b-kube-api-access-8vxcl\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.609773 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: 
\"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.610674 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.610757 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-logs\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.614512 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-scripts\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.615317 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.615697 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-config-data\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: 
I0227 00:26:45.616654 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.616821 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5d3045414bd1cd74ec61e0394ba262493610c57a87bbc940ef275e8fc1bc2ecf/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.638540 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vxcl\" (UniqueName: \"kubernetes.io/projected/fe8b7774-c640-416d-82a4-535fee88a47b-kube-api-access-8vxcl\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.710958 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.984979 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:26:46 crc kubenswrapper[4781]: I0227 00:26:46.228116 4781 generic.go:334] "Generic (PLEG): container finished" podID="75b432e5-2a1d-421d-ac63-202bbe4be5c5" containerID="914d10b311f6e761cfe3376de0d9169e16d04822bd5c0495a9b64cbbe456b1f4" exitCode=0 Feb 27 00:26:46 crc kubenswrapper[4781]: I0227 00:26:46.228209 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fwppv" event={"ID":"75b432e5-2a1d-421d-ac63-202bbe4be5c5","Type":"ContainerDied","Data":"914d10b311f6e761cfe3376de0d9169e16d04822bd5c0495a9b64cbbe456b1f4"} Feb 27 00:26:46 crc kubenswrapper[4781]: I0227 00:26:46.233849 4781 generic.go:334] "Generic (PLEG): container finished" podID="ac31f36e-35c4-4f48-a05b-f49855052358" containerID="2a9c0dd16ab1b03b070571c17e79b75425373239b1c45b1c5b15a3b9a4d8f4b1" exitCode=0 Feb 27 00:26:46 crc kubenswrapper[4781]: I0227 00:26:46.233893 4781 generic.go:334] "Generic (PLEG): container finished" podID="ac31f36e-35c4-4f48-a05b-f49855052358" containerID="af130dc6503472cd229a6073407a944eb9f345fe510e3a5882815f2cc79c8dc8" exitCode=143 Feb 27 00:26:46 crc kubenswrapper[4781]: I0227 00:26:46.233926 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ac31f36e-35c4-4f48-a05b-f49855052358","Type":"ContainerDied","Data":"2a9c0dd16ab1b03b070571c17e79b75425373239b1c45b1c5b15a3b9a4d8f4b1"} Feb 27 00:26:46 crc kubenswrapper[4781]: I0227 00:26:46.233989 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ac31f36e-35c4-4f48-a05b-f49855052358","Type":"ContainerDied","Data":"af130dc6503472cd229a6073407a944eb9f345fe510e3a5882815f2cc79c8dc8"} Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.016879 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.060299 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-combined-ca-bundle\") pod \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.060368 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-config-data\") pod \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.060428 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-fernet-keys\") pod \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.060456 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-scripts\") pod \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.060492 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-credential-keys\") pod \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.060570 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdcl6\" (UniqueName: 
\"kubernetes.io/projected/75b432e5-2a1d-421d-ac63-202bbe4be5c5-kube-api-access-pdcl6\") pod \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.069478 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "75b432e5-2a1d-421d-ac63-202bbe4be5c5" (UID: "75b432e5-2a1d-421d-ac63-202bbe4be5c5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.069667 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75b432e5-2a1d-421d-ac63-202bbe4be5c5-kube-api-access-pdcl6" (OuterVolumeSpecName: "kube-api-access-pdcl6") pod "75b432e5-2a1d-421d-ac63-202bbe4be5c5" (UID: "75b432e5-2a1d-421d-ac63-202bbe4be5c5"). InnerVolumeSpecName "kube-api-access-pdcl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.083892 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "75b432e5-2a1d-421d-ac63-202bbe4be5c5" (UID: "75b432e5-2a1d-421d-ac63-202bbe4be5c5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.083981 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-scripts" (OuterVolumeSpecName: "scripts") pod "75b432e5-2a1d-421d-ac63-202bbe4be5c5" (UID: "75b432e5-2a1d-421d-ac63-202bbe4be5c5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.105879 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-config-data" (OuterVolumeSpecName: "config-data") pod "75b432e5-2a1d-421d-ac63-202bbe4be5c5" (UID: "75b432e5-2a1d-421d-ac63-202bbe4be5c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.135119 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75b432e5-2a1d-421d-ac63-202bbe4be5c5" (UID: "75b432e5-2a1d-421d-ac63-202bbe4be5c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.163486 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.163529 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.163541 4781 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.163552 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:48 crc 
kubenswrapper[4781]: I0227 00:26:48.163562 4781 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.163573 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdcl6\" (UniqueName: \"kubernetes.io/projected/75b432e5-2a1d-421d-ac63-202bbe4be5c5-kube-api-access-pdcl6\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.258170 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fwppv" event={"ID":"75b432e5-2a1d-421d-ac63-202bbe4be5c5","Type":"ContainerDied","Data":"7610fc60d1158180f6a0fcb6c59bb1930e7a3d6fd2c319da87c248ff0413eb39"} Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.258219 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7610fc60d1158180f6a0fcb6c59bb1930e7a3d6fd2c319da87c248ff0413eb39" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.258287 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.361636 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fwppv"] Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.370950 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fwppv"] Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.471686 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gxj6b"] Feb 27 00:26:48 crc kubenswrapper[4781]: E0227 00:26:48.472154 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b432e5-2a1d-421d-ac63-202bbe4be5c5" containerName="keystone-bootstrap" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.472171 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b432e5-2a1d-421d-ac63-202bbe4be5c5" containerName="keystone-bootstrap" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.472347 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="75b432e5-2a1d-421d-ac63-202bbe4be5c5" containerName="keystone-bootstrap" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.472999 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.478443 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.478497 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.478525 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.478861 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nhgp" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.481433 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gxj6b"] Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.573145 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-scripts\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.573275 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh2sk\" (UniqueName: \"kubernetes.io/projected/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-kube-api-access-jh2sk\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.573446 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-config-data\") pod \"keystone-bootstrap-gxj6b\" (UID: 
\"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.573500 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-credential-keys\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.573578 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-combined-ca-bundle\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.573619 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-fernet-keys\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.678112 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-config-data\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.678257 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-credential-keys\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " 
pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.678297 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-combined-ca-bundle\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.678341 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-fernet-keys\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.678429 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-scripts\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.678556 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh2sk\" (UniqueName: \"kubernetes.io/projected/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-kube-api-access-jh2sk\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.683303 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-credential-keys\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.683849 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-fernet-keys\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.684266 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-config-data\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.684583 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-combined-ca-bundle\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.686666 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-scripts\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.698557 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh2sk\" (UniqueName: \"kubernetes.io/projected/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-kube-api-access-jh2sk\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.725596 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 
00:26:48.725884 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="prometheus" containerID="cri-o://490f54d4fc0654da6b5add2d9e470584271088a4fc9d0ff0972339bc97ab6f8f" gracePeriod=600 Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.725948 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="thanos-sidecar" containerID="cri-o://0d295c8666e863d2c0e4e0d3a3e33356c58f61c54e944f8ced4d911133124bc0" gracePeriod=600 Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.726036 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="config-reloader" containerID="cri-o://c6e860c6c62b63e5a5fe835a4877c45040a36e7fc332cce5af395a3eaa5e24b1" gracePeriod=600 Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.802656 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.225769 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.302886 4781 generic.go:334] "Generic (PLEG): container finished" podID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerID="0d295c8666e863d2c0e4e0d3a3e33356c58f61c54e944f8ced4d911133124bc0" exitCode=0 Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.302925 4781 generic.go:334] "Generic (PLEG): container finished" podID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerID="c6e860c6c62b63e5a5fe835a4877c45040a36e7fc332cce5af395a3eaa5e24b1" exitCode=0 Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.302943 4781 generic.go:334] "Generic (PLEG): container finished" podID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerID="490f54d4fc0654da6b5add2d9e470584271088a4fc9d0ff0972339bc97ab6f8f" exitCode=0 Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.302966 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerDied","Data":"0d295c8666e863d2c0e4e0d3a3e33356c58f61c54e944f8ced4d911133124bc0"} Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.303002 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerDied","Data":"c6e860c6c62b63e5a5fe835a4877c45040a36e7fc332cce5af395a3eaa5e24b1"} Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.303013 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerDied","Data":"490f54d4fc0654da6b5add2d9e470584271088a4fc9d0ff0972339bc97ab6f8f"} Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 
00:26:49.356754 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75b432e5-2a1d-421d-ac63-202bbe4be5c5" path="/var/lib/kubelet/pods/75b432e5-2a1d-421d-ac63-202bbe4be5c5/volumes" Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.357352 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc44h"] Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.357559 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-gc44h" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="dnsmasq-dns" containerID="cri-o://e2bf980506549d387ee967a300bd50ff50a9e4489a44bcc2c952a5e2c00137a5" gracePeriod=10 Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.766684 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:26:50 crc kubenswrapper[4781]: I0227 00:26:50.315451 4781 generic.go:334] "Generic (PLEG): container finished" podID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerID="e2bf980506549d387ee967a300bd50ff50a9e4489a44bcc2c952a5e2c00137a5" exitCode=0 Feb 27 00:26:50 crc kubenswrapper[4781]: I0227 00:26:50.315488 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gc44h" event={"ID":"8e37b0a7-69ac-439e-9c5a-207210fe40c8","Type":"ContainerDied","Data":"e2bf980506549d387ee967a300bd50ff50a9e4489a44bcc2c952a5e2c00137a5"} Feb 27 00:26:50 crc kubenswrapper[4781]: I0227 00:26:50.483951 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-gc44h" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Feb 27 00:26:51 crc kubenswrapper[4781]: I0227 00:26:51.096036 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" 
containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.116:9090/-/ready\": dial tcp 10.217.0.116:9090: connect: connection refused" Feb 27 00:26:54 crc kubenswrapper[4781]: I0227 00:26:54.909958 4781 scope.go:117] "RemoveContainer" containerID="a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.484983 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-gc44h" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Feb 27 00:26:55 crc kubenswrapper[4781]: E0227 00:26:55.488273 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 27 00:26:55 crc kubenswrapper[4781]: E0227 00:26:55.488548 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lmp94,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-bf4zw_openstack(314ca901-3264-4136-b377-daad0075b72c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:26:55 crc kubenswrapper[4781]: E0227 00:26:55.489620 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-bf4zw" 
podUID="314ca901-3264-4136-b377-daad0075b72c" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.591686 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.629251 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-combined-ca-bundle\") pod \"ac31f36e-35c4-4f48-a05b-f49855052358\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.629445 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-scripts\") pod \"ac31f36e-35c4-4f48-a05b-f49855052358\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.629508 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-logs\") pod \"ac31f36e-35c4-4f48-a05b-f49855052358\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.629571 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt497\" (UniqueName: \"kubernetes.io/projected/ac31f36e-35c4-4f48-a05b-f49855052358-kube-api-access-wt497\") pod \"ac31f36e-35c4-4f48-a05b-f49855052358\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.629600 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-config-data\") pod \"ac31f36e-35c4-4f48-a05b-f49855052358\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " Feb 27 00:26:55 crc 
kubenswrapper[4781]: I0227 00:26:55.629655 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-httpd-run\") pod \"ac31f36e-35c4-4f48-a05b-f49855052358\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") "
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.629785 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"ac31f36e-35c4-4f48-a05b-f49855052358\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") "
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.630042 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-logs" (OuterVolumeSpecName: "logs") pod "ac31f36e-35c4-4f48-a05b-f49855052358" (UID: "ac31f36e-35c4-4f48-a05b-f49855052358"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.630219 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ac31f36e-35c4-4f48-a05b-f49855052358" (UID: "ac31f36e-35c4-4f48-a05b-f49855052358"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.630681 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-logs\") on node \"crc\" DevicePath \"\""
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.630701 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.633404 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-scripts" (OuterVolumeSpecName: "scripts") pod "ac31f36e-35c4-4f48-a05b-f49855052358" (UID: "ac31f36e-35c4-4f48-a05b-f49855052358"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.638830 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac31f36e-35c4-4f48-a05b-f49855052358-kube-api-access-wt497" (OuterVolumeSpecName: "kube-api-access-wt497") pod "ac31f36e-35c4-4f48-a05b-f49855052358" (UID: "ac31f36e-35c4-4f48-a05b-f49855052358"). InnerVolumeSpecName "kube-api-access-wt497". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.645044 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549" (OuterVolumeSpecName: "glance") pod "ac31f36e-35c4-4f48-a05b-f49855052358" (UID: "ac31f36e-35c4-4f48-a05b-f49855052358"). InnerVolumeSpecName "pvc-5bfae319-10bf-453e-8fc6-7da85b46e549". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.659856 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac31f36e-35c4-4f48-a05b-f49855052358" (UID: "ac31f36e-35c4-4f48-a05b-f49855052358"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.688039 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-config-data" (OuterVolumeSpecName: "config-data") pod "ac31f36e-35c4-4f48-a05b-f49855052358" (UID: "ac31f36e-35c4-4f48-a05b-f49855052358"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.732651 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.733010 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.733020 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt497\" (UniqueName: \"kubernetes.io/projected/ac31f36e-35c4-4f48-a05b-f49855052358-kube-api-access-wt497\") on node \"crc\" DevicePath \"\""
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.733030 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.733067 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") on node \"crc\" "
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.766000 4781 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.766152 4781 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5bfae319-10bf-453e-8fc6-7da85b46e549" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549") on node "crc"
Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.835228 4781 reconciler_common.go:293] "Volume detached for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") on node \"crc\" DevicePath \"\""
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.387345 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ac31f36e-35c4-4f48-a05b-f49855052358","Type":"ContainerDied","Data":"084b441c862a06da9a9c0b4c64f7175017dfb66c7fcb6d7f1ae1871790335c15"}
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.387368 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: E0227 00:26:56.390504 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-bf4zw" podUID="314ca901-3264-4136-b377-daad0075b72c"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.450285 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.462047 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.471369 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 27 00:26:56 crc kubenswrapper[4781]: E0227 00:26:56.471875 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" containerName="glance-log"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.471898 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" containerName="glance-log"
Feb 27 00:26:56 crc kubenswrapper[4781]: E0227 00:26:56.471931 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" containerName="glance-httpd"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.471941 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" containerName="glance-httpd"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.472180 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" containerName="glance-log"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.472214 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" containerName="glance-httpd"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.473485 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.476658 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.478188 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.490774 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.652554 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.653141 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.653199 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.653253 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.653327 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.653434 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc459\" (UniqueName: \"kubernetes.io/projected/6ef40468-5e47-4e34-a641-bfbe7803d480-kube-api-access-kc459\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.653464 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-logs\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.653514 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.754793 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.754904 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.754959 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.754987 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.755016 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.755060 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.755099 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc459\" (UniqueName: \"kubernetes.io/projected/6ef40468-5e47-4e34-a641-bfbe7803d480-kube-api-access-kc459\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.755124 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-logs\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.755610 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-logs\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.757612 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.759913 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.760049 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.760707 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.761254 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.761291 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a7b96405e17327882846f95b5adf8b290f3f24e0a3e5cf6d272cf20133e6cae4/globalmount\"" pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.764242 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.838043 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc459\" (UniqueName: \"kubernetes.io/projected/6ef40468-5e47-4e34-a641-bfbe7803d480-kube-api-access-kc459\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.859283 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:57 crc kubenswrapper[4781]: I0227 00:26:57.126813 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 27 00:26:57 crc kubenswrapper[4781]: I0227 00:26:57.320565 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" path="/var/lib/kubelet/pods/ac31f36e-35c4-4f48-a05b-f49855052358/volumes"
Feb 27 00:26:59 crc kubenswrapper[4781]: I0227 00:26:59.095923 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.116:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:27:04 crc kubenswrapper[4781]: I0227 00:27:04.107749 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.116:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:27:04 crc kubenswrapper[4781]: I0227 00:27:04.109748 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Feb 27 00:27:05 crc kubenswrapper[4781]: E0227 00:27:05.344939 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified"
Feb 27 00:27:05 crc kubenswrapper[4781]: E0227 00:27:05.346182 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5cdh5ffh577h57dh66ch59h8chffh65ch575h67ch5b9hfbh544h9bh58ch64h696h95h67fh5dh64bh58fh665h64h8bh9h649h54ch565h5bdh66fq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9qwfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(c607f0bd-ab23-4fc5-8aa7-437be5e6d59d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.484554 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-gc44h" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout"
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.486695 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-gc44h"
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.495516 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerDied","Data":"2ad75abe5f1e9859dec62d9d7e1f0e4f7552fc881d371d2d01763329d31bdef8"}
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.495564 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ad75abe5f1e9859dec62d9d7e1f0e4f7552fc881d371d2d01763329d31bdef8"
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.498370 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gc44h" event={"ID":"8e37b0a7-69ac-439e-9c5a-207210fe40c8","Type":"ContainerDied","Data":"e1839c0058f09c92d633d8b44bcde9496faf128970e2a8993b81a296f21aac5b"}
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.498402 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1839c0058f09c92d633d8b44bcde9496faf128970e2a8993b81a296f21aac5b"
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.581161 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gc44h"
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.596718 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.646434 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jz5b\" (UniqueName: \"kubernetes.io/projected/8e37b0a7-69ac-439e-9c5a-207210fe40c8-kube-api-access-8jz5b\") pod \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") "
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.646508 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-dns-svc\") pod \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") "
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.646639 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-nb\") pod \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") "
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.646682 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-config\") pod \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") "
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.646741 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-sb\") pod \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") "
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.681741 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e37b0a7-69ac-439e-9c5a-207210fe40c8-kube-api-access-8jz5b" (OuterVolumeSpecName: "kube-api-access-8jz5b") pod "8e37b0a7-69ac-439e-9c5a-207210fe40c8" (UID: "8e37b0a7-69ac-439e-9c5a-207210fe40c8"). InnerVolumeSpecName "kube-api-access-8jz5b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.714596 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e37b0a7-69ac-439e-9c5a-207210fe40c8" (UID: "8e37b0a7-69ac-439e-9c5a-207210fe40c8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.740170 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-config" (OuterVolumeSpecName: "config") pod "8e37b0a7-69ac-439e-9c5a-207210fe40c8" (UID: "8e37b0a7-69ac-439e-9c5a-207210fe40c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.741242 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8e37b0a7-69ac-439e-9c5a-207210fe40c8" (UID: "8e37b0a7-69ac-439e-9c5a-207210fe40c8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748152 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-2\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") "
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748278 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-web-config\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") "
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748322 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f85c54b-b800-429a-ba2d-fe22056ac907-config-out\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") "
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748517 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") "
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748578 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2945l\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-kube-api-access-2945l\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") "
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748614 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-thanos-prometheus-http-client-file\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") "
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748659 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-tls-assets\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") "
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748680 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748716 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-0\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") "
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748766 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-config\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") "
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748798 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-1\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") "
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.749283 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.749306 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.749318 4781 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.749330 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jz5b\" (UniqueName: \"kubernetes.io/projected/8e37b0a7-69ac-439e-9c5a-207210fe40c8-kube-api-access-8jz5b\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.749341 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.749838 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.749871 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.753246 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f85c54b-b800-429a-ba2d-fe22056ac907-config-out" (OuterVolumeSpecName: "config-out") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.756862 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.759081 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-kube-api-access-2945l" (OuterVolumeSpecName: "kube-api-access-2945l") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "kube-api-access-2945l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.759864 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.762235 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-config" (OuterVolumeSpecName: "config") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.778183 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-web-config" (OuterVolumeSpecName: "web-config") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.790351 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.811334 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8e37b0a7-69ac-439e-9c5a-207210fe40c8" (UID: "8e37b0a7-69ac-439e-9c5a-207210fe40c8"). InnerVolumeSpecName "ovsdbserver-sb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.850966 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2945l\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-kube-api-access-2945l\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851021 4781 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851038 4781 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851053 4781 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851068 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851080 4781 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851090 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851101 4781 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-web-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851112 4781 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f85c54b-b800-429a-ba2d-fe22056ac907-config-out\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851156 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") on node \"crc\" " Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.881034 4781 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.881483 4781 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6") on node "crc" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.952959 4781 reconciler_common.go:293] "Volume detached for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.038835 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.511506 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f43ab5c-f862-468c-92c1-ec7366eb7ed0" containerID="9abe8ef3a48995708f20de72923495db036e6761eb107a6dfc8ea5dccc96bf58" exitCode=0 Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.511668 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.511731 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.511578 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bk54r" event={"ID":"3f43ab5c-f862-468c-92c1-ec7366eb7ed0","Type":"ContainerDied","Data":"9abe8ef3a48995708f20de72923495db036e6761eb107a6dfc8ea5dccc96bf58"} Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.561718 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc44h"] Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.573369 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc44h"] Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.585752 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.596183 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.608380 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 00:27:06 crc kubenswrapper[4781]: E0227 00:27:06.608795 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="init" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.608822 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="init" Feb 27 00:27:06 crc kubenswrapper[4781]: E0227 00:27:06.608846 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="prometheus" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.608853 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="prometheus" Feb 27 00:27:06 crc kubenswrapper[4781]: E0227 
00:27:06.608863 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="init-config-reloader" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.608870 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="init-config-reloader" Feb 27 00:27:06 crc kubenswrapper[4781]: E0227 00:27:06.608877 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="thanos-sidecar" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.608883 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="thanos-sidecar" Feb 27 00:27:06 crc kubenswrapper[4781]: E0227 00:27:06.608897 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="config-reloader" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.608904 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="config-reloader" Feb 27 00:27:06 crc kubenswrapper[4781]: E0227 00:27:06.608918 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="dnsmasq-dns" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.608923 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="dnsmasq-dns" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.609082 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="config-reloader" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.609092 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="dnsmasq-dns" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 
00:27:06.609110 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="prometheus" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.609120 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="thanos-sidecar" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.610739 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.613855 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.614022 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.614131 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.614253 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.614373 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.614680 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.615159 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-zmzb4" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.615257 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"prometheus-metric-storage-web-config" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.617194 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.621292 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.668902 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll7tb\" (UniqueName: \"kubernetes.io/projected/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-kube-api-access-ll7tb\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.668943 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.668999 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669269 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669298 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669323 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669348 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669368 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-config\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 
00:27:06.669385 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669401 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669424 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669568 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669703 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-2\") pod 
\"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771395 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771494 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771515 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771545 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771571 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771590 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-config\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771609 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771633 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771677 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771709 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771738 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771768 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll7tb\" (UniqueName: \"kubernetes.io/projected/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-kube-api-access-ll7tb\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771789 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.773304 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 
00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.773876 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.776088 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.777499 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.777541 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b26095f48a6799aae7472dc34ad76c7f8559a3fa84033df1f18203d2595242ed/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.777993 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.778518 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.778660 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.780235 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.784045 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.784813 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.790298 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-config\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.790745 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.797643 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ll7tb\" (UniqueName: \"kubernetes.io/projected/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-kube-api-access-ll7tb\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.820448 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.969034 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:07 crc kubenswrapper[4781]: E0227 00:27:07.271535 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 27 00:27:07 crc kubenswrapper[4781]: E0227 00:27:07.271719 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tvkqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-9vlp4_openstack(aef65495-ecb2-4396-bb05-a4c5ee48f291): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:27:07 crc kubenswrapper[4781]: E0227 00:27:07.272841 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-9vlp4" podUID="aef65495-ecb2-4396-bb05-a4c5ee48f291" Feb 27 00:27:07 crc kubenswrapper[4781]: I0227 00:27:07.320391 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" path="/var/lib/kubelet/pods/1f85c54b-b800-429a-ba2d-fe22056ac907/volumes" Feb 27 00:27:07 crc kubenswrapper[4781]: I0227 00:27:07.321259 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" path="/var/lib/kubelet/pods/8e37b0a7-69ac-439e-9c5a-207210fe40c8/volumes" Feb 27 00:27:07 crc kubenswrapper[4781]: E0227 00:27:07.525804 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-9vlp4" podUID="aef65495-ecb2-4396-bb05-a4c5ee48f291" Feb 27 00:27:10 crc kubenswrapper[4781]: I0227 00:27:10.486888 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-gc44h" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 
00:27:11.274720 4781 scope.go:117] "RemoveContainer" containerID="4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e" Feb 27 00:27:11 crc kubenswrapper[4781]: E0227 00:27:11.275492 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e\": container with ID starting with 4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e not found: ID does not exist" containerID="4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.275519 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e"} err="failed to get container status \"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e\": rpc error: code = NotFound desc = could not find container \"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e\": container with ID starting with 4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e not found: ID does not exist" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.275542 4781 scope.go:117] "RemoveContainer" containerID="a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318" Feb 27 00:27:11 crc kubenswrapper[4781]: E0227 00:27:11.275937 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318\": container with ID starting with a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318 not found: ID does not exist" containerID="a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.275956 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318"} err="failed to get container status \"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318\": rpc error: code = NotFound desc = could not find container \"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318\": container with ID starting with a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318 not found: ID does not exist" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.275970 4781 scope.go:117] "RemoveContainer" containerID="4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.276255 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e"} err="failed to get container status \"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e\": rpc error: code = NotFound desc = could not find container \"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e\": container with ID starting with 4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e not found: ID does not exist" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.276279 4781 scope.go:117] "RemoveContainer" containerID="a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.276541 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318"} err="failed to get container status \"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318\": rpc error: code = NotFound desc = could not find container \"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318\": container with ID starting with a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318 not found: ID does not 
exist" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.276566 4781 scope.go:117] "RemoveContainer" containerID="2a9c0dd16ab1b03b070571c17e79b75425373239b1c45b1c5b15a3b9a4d8f4b1" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.430255 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bk54r" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.456332 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzkc9\" (UniqueName: \"kubernetes.io/projected/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-kube-api-access-lzkc9\") pod \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.456421 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-combined-ca-bundle\") pod \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.456532 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-config\") pod \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.467451 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-kube-api-access-lzkc9" (OuterVolumeSpecName: "kube-api-access-lzkc9") pod "3f43ab5c-f862-468c-92c1-ec7366eb7ed0" (UID: "3f43ab5c-f862-468c-92c1-ec7366eb7ed0"). InnerVolumeSpecName "kube-api-access-lzkc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.500173 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-config" (OuterVolumeSpecName: "config") pod "3f43ab5c-f862-468c-92c1-ec7366eb7ed0" (UID: "3f43ab5c-f862-468c-92c1-ec7366eb7ed0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.500813 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f43ab5c-f862-468c-92c1-ec7366eb7ed0" (UID: "3f43ab5c-f862-468c-92c1-ec7366eb7ed0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.556444 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fe8b7774-c640-416d-82a4-535fee88a47b","Type":"ContainerStarted","Data":"5275ce5209350a6beca9364d6baa1757ba1b2bb302e2e2d5d8f5780ac3a4ca75"} Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.558314 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzkc9\" (UniqueName: \"kubernetes.io/projected/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-kube-api-access-lzkc9\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.558342 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.558356 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.560229 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bk54r" event={"ID":"3f43ab5c-f862-468c-92c1-ec7366eb7ed0","Type":"ContainerDied","Data":"f4907a514c133717f2dd463877fdc9d6b4b6535ee45f8865a1b93ba48242fe73"} Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.560261 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4907a514c133717f2dd463877fdc9d6b4b6535ee45f8865a1b93ba48242fe73" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.560285 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bk54r" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.203735 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.282206 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gxj6b"] Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.716012 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bp4v9"] Feb 27 00:27:12 crc kubenswrapper[4781]: E0227 00:27:12.716715 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f43ab5c-f862-468c-92c1-ec7366eb7ed0" containerName="neutron-db-sync" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.716728 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f43ab5c-f862-468c-92c1-ec7366eb7ed0" containerName="neutron-db-sync" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.716903 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f43ab5c-f862-468c-92c1-ec7366eb7ed0" containerName="neutron-db-sync" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.721145 4781 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.748726 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bp4v9"] Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.765551 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.799213 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.799297 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-config\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.799391 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.799495 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxql5\" (UniqueName: \"kubernetes.io/projected/dd15e642-6664-416f-ac4e-9cddc96e5642-kube-api-access-rxql5\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " 
pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.799520 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.799555 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-svc\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.808164 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5445c56cbd-fmcjz"] Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.810063 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.821174 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.821493 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.821719 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-d4ppr" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.821858 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.827412 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5445c56cbd-fmcjz"] Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.897901 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.897959 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901341 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-config\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " 
pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901433 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901465 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-config\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901497 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-ovndb-tls-certs\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901533 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b22wp\" (UniqueName: \"kubernetes.io/projected/34294cdd-a18f-4453-8d43-c4d1290e3c59-kube-api-access-b22wp\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901566 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-httpd-config\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " 
pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901584 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxql5\" (UniqueName: \"kubernetes.io/projected/dd15e642-6664-416f-ac4e-9cddc96e5642-kube-api-access-rxql5\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901606 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901657 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-svc\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901687 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901709 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-combined-ca-bundle\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " 
pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.902763 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.902783 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-config\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.903310 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.903354 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.908410 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-svc\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.939196 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxql5\" (UniqueName: \"kubernetes.io/projected/dd15e642-6664-416f-ac4e-9cddc96e5642-kube-api-access-rxql5\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.002937 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b22wp\" (UniqueName: \"kubernetes.io/projected/34294cdd-a18f-4453-8d43-c4d1290e3c59-kube-api-access-b22wp\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.003025 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-httpd-config\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.003100 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-combined-ca-bundle\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.003180 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-config\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.003212 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-ovndb-tls-certs\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.008434 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-combined-ca-bundle\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.009690 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-httpd-config\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.013337 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-ovndb-tls-certs\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.022410 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-config\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.025256 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b22wp\" (UniqueName: \"kubernetes.io/projected/34294cdd-a18f-4453-8d43-c4d1290e3c59-kube-api-access-b22wp\") pod 
\"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.065334 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.136653 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.696379 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b48494fc7-447pr"] Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.701075 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.703358 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.703969 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.718097 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b48494fc7-447pr"] Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.742477 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-httpd-config\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.742522 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-ovndb-tls-certs\") pod 
\"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.742539 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-public-tls-certs\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.742617 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-combined-ca-bundle\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.742668 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4zg7\" (UniqueName: \"kubernetes.io/projected/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-kube-api-access-h4zg7\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.742688 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-config\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.742709 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-internal-tls-certs\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.845526 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-combined-ca-bundle\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.845606 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4zg7\" (UniqueName: \"kubernetes.io/projected/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-kube-api-access-h4zg7\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.845665 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-config\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.845714 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-internal-tls-certs\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.845833 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-httpd-config\") pod 
\"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.845872 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-ovndb-tls-certs\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.845892 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-public-tls-certs\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.852126 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-httpd-config\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.853769 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-combined-ca-bundle\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.854025 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-internal-tls-certs\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " 
pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.855061 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-config\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.860183 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-ovndb-tls-certs\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.863615 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-public-tls-certs\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.863989 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4zg7\" (UniqueName: \"kubernetes.io/projected/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-kube-api-access-h4zg7\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:15 crc kubenswrapper[4781]: I0227 00:27:15.025737 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:19 crc kubenswrapper[4781]: I0227 00:27:19.903470 4781 scope.go:117] "RemoveContainer" containerID="af130dc6503472cd229a6073407a944eb9f345fe510e3a5882815f2cc79c8dc8" Feb 27 00:27:20 crc kubenswrapper[4781]: I0227 00:27:20.640415 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ef40468-5e47-4e34-a641-bfbe7803d480","Type":"ContainerStarted","Data":"7d0ca3340d609e18433fc291df1d484624d9e133542d96a4dff1a09c6cf6905a"} Feb 27 00:27:20 crc kubenswrapper[4781]: I0227 00:27:20.643858 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gxj6b" event={"ID":"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1","Type":"ContainerStarted","Data":"cd1310454f14cbb6fa301146043553cf2eabbe6f919a1570a19e8768d9fd1b5d"} Feb 27 00:27:20 crc kubenswrapper[4781]: I0227 00:27:20.645057 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f","Type":"ContainerStarted","Data":"07e4f346c30153beb2d7f86fae70b693d729de2b22f5a27f4024b9039dd8a05a"} Feb 27 00:27:23 crc kubenswrapper[4781]: E0227 00:27:23.769543 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 27 00:27:23 crc kubenswrapper[4781]: E0227 00:27:23.770115 4781 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 27 00:27:23 crc kubenswrapper[4781]: E0227 00:27:23.770248 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bwsv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,Rea
dOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-l9w6z_openstack(2274af64-0743-4ede-8fb8-e2ed801638ac): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:27:23 crc kubenswrapper[4781]: E0227 00:27:23.771454 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-l9w6z" podUID="2274af64-0743-4ede-8fb8-e2ed801638ac" Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.314252 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5445c56cbd-fmcjz"] Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.528272 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b48494fc7-447pr"] Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.603690 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bp4v9"] Feb 27 00:27:24 crc kubenswrapper[4781]: W0227 00:27:24.639832 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd15e642_6664_416f_ac4e_9cddc96e5642.slice/crio-9e67430f08589dcb2cfca360edd38ce35b2b7fe28eecbb76ca402ae3e309ab2c WatchSource:0}: Error finding container 9e67430f08589dcb2cfca360edd38ce35b2b7fe28eecbb76ca402ae3e309ab2c: Status 404 returned error can't find the container with id 9e67430f08589dcb2cfca360edd38ce35b2b7fe28eecbb76ca402ae3e309ab2c Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.712910 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5445c56cbd-fmcjz" event={"ID":"34294cdd-a18f-4453-8d43-c4d1290e3c59","Type":"ContainerStarted","Data":"0384541fca62a0c17aeca1e73d81a12f432aa7f744f83cfe8433a7d935539961"} Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.712968 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5445c56cbd-fmcjz" event={"ID":"34294cdd-a18f-4453-8d43-c4d1290e3c59","Type":"ContainerStarted","Data":"35abaddf64ded29044d57543bd49dba6fb7cc622e405ec56e6449b1f79234b7a"} Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.715083 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jqsnp" event={"ID":"a3fa4251-dd48-417b-8002-6df02d3d3dac","Type":"ContainerStarted","Data":"3dc1eb7dbdd6694e7292463c3972ed88e476b4fd179d083eaeff0cf57f961958"} Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.718153 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b48494fc7-447pr" event={"ID":"2138a247-a569-4ed6-91a9-5dde2a0b5fa9","Type":"ContainerStarted","Data":"8cfc8b26590e03ab4b9d1a7221cd85bef307e38eb533c1221abe3eafc0089adc"} Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.721960 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d","Type":"ContainerStarted","Data":"a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892"} Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.726837 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gxj6b" event={"ID":"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1","Type":"ContainerStarted","Data":"d4ee7796e64f1964f0ab74414c33a59e4f95e98e4eb4a260e730590563ac50fe"} Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.737924 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jqsnp" podStartSLOduration=32.429275189 
podStartE2EDuration="46.737903694s" podCreationTimestamp="2026-02-27 00:26:38 +0000 UTC" firstStartedPulling="2026-02-27 00:26:41.17601959 +0000 UTC m=+1270.433559144" lastFinishedPulling="2026-02-27 00:26:55.484648055 +0000 UTC m=+1284.742187649" observedRunningTime="2026-02-27 00:27:24.733838306 +0000 UTC m=+1313.991377860" watchObservedRunningTime="2026-02-27 00:27:24.737903694 +0000 UTC m=+1313.995443248" Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.752887 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bf4zw" event={"ID":"314ca901-3264-4136-b377-daad0075b72c","Type":"ContainerStarted","Data":"89638f7647330ea3c5230d3d253e70beeda178adf35863cd73f9bfed5a1f6c4c"} Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.753068 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gxj6b" podStartSLOduration=36.753058564 podStartE2EDuration="36.753058564s" podCreationTimestamp="2026-02-27 00:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:24.752111899 +0000 UTC m=+1314.009651453" watchObservedRunningTime="2026-02-27 00:27:24.753058564 +0000 UTC m=+1314.010598118" Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.760264 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" event={"ID":"dd15e642-6664-416f-ac4e-9cddc96e5642","Type":"ContainerStarted","Data":"9e67430f08589dcb2cfca360edd38ce35b2b7fe28eecbb76ca402ae3e309ab2c"} Feb 27 00:27:24 crc kubenswrapper[4781]: E0227 00:27:24.761512 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-l9w6z" 
podUID="2274af64-0743-4ede-8fb8-e2ed801638ac" Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.776764 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-bf4zw" podStartSLOduration=3.421811665 podStartE2EDuration="46.77674243s" podCreationTimestamp="2026-02-27 00:26:38 +0000 UTC" firstStartedPulling="2026-02-27 00:26:40.582292875 +0000 UTC m=+1269.839832429" lastFinishedPulling="2026-02-27 00:27:23.93722364 +0000 UTC m=+1313.194763194" observedRunningTime="2026-02-27 00:27:24.772031935 +0000 UTC m=+1314.029571489" watchObservedRunningTime="2026-02-27 00:27:24.77674243 +0000 UTC m=+1314.034282004" Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.772214 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fe8b7774-c640-416d-82a4-535fee88a47b","Type":"ContainerStarted","Data":"6b3c74fbdf3287fb50b684f9cb5119d529b541e265cc750ba76a4e2cbc2b36ce"} Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.773959 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ef40468-5e47-4e34-a641-bfbe7803d480","Type":"ContainerStarted","Data":"1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d"} Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.776974 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b48494fc7-447pr" event={"ID":"2138a247-a569-4ed6-91a9-5dde2a0b5fa9","Type":"ContainerStarted","Data":"d7c09d305d22e97d0875bde304e390f511aac9300a440daba221eab217d0ec4d"} Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.777018 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b48494fc7-447pr" event={"ID":"2138a247-a569-4ed6-91a9-5dde2a0b5fa9","Type":"ContainerStarted","Data":"994246fa04a777c2f0ceb85d5b3e476072c41f89030472fc48f602b083a3eada"} Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.777056 4781 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.780351 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9vlp4" event={"ID":"aef65495-ecb2-4396-bb05-a4c5ee48f291","Type":"ContainerStarted","Data":"7d9a07674537261cb97d86282370b22b357712af922b31aea2a8cfe67e8a0a4c"} Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.781911 4781 generic.go:334] "Generic (PLEG): container finished" podID="dd15e642-6664-416f-ac4e-9cddc96e5642" containerID="90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5" exitCode=0 Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.781962 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" event={"ID":"dd15e642-6664-416f-ac4e-9cddc96e5642","Type":"ContainerDied","Data":"90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5"} Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.786229 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5445c56cbd-fmcjz" event={"ID":"34294cdd-a18f-4453-8d43-c4d1290e3c59","Type":"ContainerStarted","Data":"24cfe8100e51fc567495df6c5f9d60a27bc0381d4315fb54a1ac7e37d2a6bf89"} Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.786607 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.806737 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b48494fc7-447pr" podStartSLOduration=11.806714499 podStartE2EDuration="11.806714499s" podCreationTimestamp="2026-02-27 00:27:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:25.803377151 +0000 UTC m=+1315.060916705" watchObservedRunningTime="2026-02-27 
00:27:25.806714499 +0000 UTC m=+1315.064254053" Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.839618 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5445c56cbd-fmcjz" podStartSLOduration=13.839594868 podStartE2EDuration="13.839594868s" podCreationTimestamp="2026-02-27 00:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:25.824245282 +0000 UTC m=+1315.081784836" watchObservedRunningTime="2026-02-27 00:27:25.839594868 +0000 UTC m=+1315.097134422" Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.863268 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-9vlp4" podStartSLOduration=4.343900944 podStartE2EDuration="47.863248843s" podCreationTimestamp="2026-02-27 00:26:38 +0000 UTC" firstStartedPulling="2026-02-27 00:26:40.514584526 +0000 UTC m=+1269.772124080" lastFinishedPulling="2026-02-27 00:27:24.033932425 +0000 UTC m=+1313.291471979" observedRunningTime="2026-02-27 00:27:25.854482671 +0000 UTC m=+1315.112022225" watchObservedRunningTime="2026-02-27 00:27:25.863248843 +0000 UTC m=+1315.120788397" Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.802241 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fe8b7774-c640-416d-82a4-535fee88a47b","Type":"ContainerStarted","Data":"19709bfd53c352e0c185741ee4fec3ea205eb88bb82c12ed0931a9d4525701e0"} Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.802370 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" containerName="glance-log" containerID="cri-o://6b3c74fbdf3287fb50b684f9cb5119d529b541e265cc750ba76a4e2cbc2b36ce" gracePeriod=30 Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.802660 4781 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" containerName="glance-httpd" containerID="cri-o://19709bfd53c352e0c185741ee4fec3ea205eb88bb82c12ed0931a9d4525701e0" gracePeriod=30 Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.810643 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" event={"ID":"dd15e642-6664-416f-ac4e-9cddc96e5642","Type":"ContainerStarted","Data":"17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf"} Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.811555 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.816160 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ef40468-5e47-4e34-a641-bfbe7803d480","Type":"ContainerStarted","Data":"16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614"} Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.839032 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=41.839013122 podStartE2EDuration="41.839013122s" podCreationTimestamp="2026-02-27 00:26:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:26.824809736 +0000 UTC m=+1316.082349310" watchObservedRunningTime="2026-02-27 00:27:26.839013122 +0000 UTC m=+1316.096552676" Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.846101 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" podStartSLOduration=14.846083848 podStartE2EDuration="14.846083848s" podCreationTimestamp="2026-02-27 00:27:12 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:26.844775914 +0000 UTC m=+1316.102315468" watchObservedRunningTime="2026-02-27 00:27:26.846083848 +0000 UTC m=+1316.103623402" Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.863497 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=30.863478868 podStartE2EDuration="30.863478868s" podCreationTimestamp="2026-02-27 00:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:26.860276603 +0000 UTC m=+1316.117816157" watchObservedRunningTime="2026-02-27 00:27:26.863478868 +0000 UTC m=+1316.121018422" Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.127704 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.127761 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.127776 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.127922 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.230430 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.233090 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.852920 
4781 generic.go:334] "Generic (PLEG): container finished" podID="fe8b7774-c640-416d-82a4-535fee88a47b" containerID="19709bfd53c352e0c185741ee4fec3ea205eb88bb82c12ed0931a9d4525701e0" exitCode=0 Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.853176 4781 generic.go:334] "Generic (PLEG): container finished" podID="fe8b7774-c640-416d-82a4-535fee88a47b" containerID="6b3c74fbdf3287fb50b684f9cb5119d529b541e265cc750ba76a4e2cbc2b36ce" exitCode=143 Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.853241 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fe8b7774-c640-416d-82a4-535fee88a47b","Type":"ContainerDied","Data":"19709bfd53c352e0c185741ee4fec3ea205eb88bb82c12ed0931a9d4525701e0"} Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.853268 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fe8b7774-c640-416d-82a4-535fee88a47b","Type":"ContainerDied","Data":"6b3c74fbdf3287fb50b684f9cb5119d529b541e265cc750ba76a4e2cbc2b36ce"} Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.890820 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f","Type":"ContainerStarted","Data":"69c8105b8323ade72457bd14497f65a1085bea33148e01ee2cbcfb2de3687cdf"} Feb 27 00:27:28 crc kubenswrapper[4781]: I0227 00:27:28.916131 4781 generic.go:334] "Generic (PLEG): container finished" podID="b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" containerID="d4ee7796e64f1964f0ab74414c33a59e4f95e98e4eb4a260e730590563ac50fe" exitCode=0 Feb 27 00:27:28 crc kubenswrapper[4781]: I0227 00:27:28.917025 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gxj6b" event={"ID":"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1","Type":"ContainerDied","Data":"d4ee7796e64f1964f0ab74414c33a59e4f95e98e4eb4a260e730590563ac50fe"} Feb 27 00:27:29 crc kubenswrapper[4781]: 
I0227 00:27:29.177657 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.268800 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"fe8b7774-c640-416d-82a4-535fee88a47b\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.268857 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-config-data\") pod \"fe8b7774-c640-416d-82a4-535fee88a47b\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.268903 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-logs\") pod \"fe8b7774-c640-416d-82a4-535fee88a47b\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.268970 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-scripts\") pod \"fe8b7774-c640-416d-82a4-535fee88a47b\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.269040 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-combined-ca-bundle\") pod \"fe8b7774-c640-416d-82a4-535fee88a47b\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.269114 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-httpd-run\") pod \"fe8b7774-c640-416d-82a4-535fee88a47b\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.269154 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vxcl\" (UniqueName: \"kubernetes.io/projected/fe8b7774-c640-416d-82a4-535fee88a47b-kube-api-access-8vxcl\") pod \"fe8b7774-c640-416d-82a4-535fee88a47b\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.270694 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-logs" (OuterVolumeSpecName: "logs") pod "fe8b7774-c640-416d-82a4-535fee88a47b" (UID: "fe8b7774-c640-416d-82a4-535fee88a47b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.270965 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fe8b7774-c640-416d-82a4-535fee88a47b" (UID: "fe8b7774-c640-416d-82a4-535fee88a47b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.281013 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe8b7774-c640-416d-82a4-535fee88a47b-kube-api-access-8vxcl" (OuterVolumeSpecName: "kube-api-access-8vxcl") pod "fe8b7774-c640-416d-82a4-535fee88a47b" (UID: "fe8b7774-c640-416d-82a4-535fee88a47b"). InnerVolumeSpecName "kube-api-access-8vxcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.287936 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-scripts" (OuterVolumeSpecName: "scripts") pod "fe8b7774-c640-416d-82a4-535fee88a47b" (UID: "fe8b7774-c640-416d-82a4-535fee88a47b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.288908 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b" (OuterVolumeSpecName: "glance") pod "fe8b7774-c640-416d-82a4-535fee88a47b" (UID: "fe8b7774-c640-416d-82a4-535fee88a47b"). InnerVolumeSpecName "pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.303668 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe8b7774-c640-416d-82a4-535fee88a47b" (UID: "fe8b7774-c640-416d-82a4-535fee88a47b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.320518 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-config-data" (OuterVolumeSpecName: "config-data") pod "fe8b7774-c640-416d-82a4-535fee88a47b" (UID: "fe8b7774-c640-416d-82a4-535fee88a47b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.371814 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.371854 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.371871 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.371880 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vxcl\" (UniqueName: \"kubernetes.io/projected/fe8b7774-c640-416d-82a4-535fee88a47b-kube-api-access-8vxcl\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.371910 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") on node \"crc\" " Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.371924 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.371933 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 
00:27:29.408650 4781 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.409146 4781 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b") on node "crc" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.474528 4781 reconciler_common.go:293] "Volume detached for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.929719 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3fa4251-dd48-417b-8002-6df02d3d3dac" containerID="3dc1eb7dbdd6694e7292463c3972ed88e476b4fd179d083eaeff0cf57f961958" exitCode=0 Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.929789 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jqsnp" event={"ID":"a3fa4251-dd48-417b-8002-6df02d3d3dac","Type":"ContainerDied","Data":"3dc1eb7dbdd6694e7292463c3972ed88e476b4fd179d083eaeff0cf57f961958"} Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.934753 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.934749 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fe8b7774-c640-416d-82a4-535fee88a47b","Type":"ContainerDied","Data":"5275ce5209350a6beca9364d6baa1757ba1b2bb302e2e2d5d8f5780ac3a4ca75"} Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.934854 4781 scope.go:117] "RemoveContainer" containerID="19709bfd53c352e0c185741ee4fec3ea205eb88bb82c12ed0931a9d4525701e0" Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.998130 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.016753 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.028930 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:27:30 crc kubenswrapper[4781]: E0227 00:27:30.029352 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" containerName="glance-log" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.029368 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" containerName="glance-log" Feb 27 00:27:30 crc kubenswrapper[4781]: E0227 00:27:30.029384 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" containerName="glance-httpd" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.029391 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" containerName="glance-httpd" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.029584 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fe8b7774-c640-416d-82a4-535fee88a47b" containerName="glance-httpd" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.029601 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" containerName="glance-log" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.030660 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.033123 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.040272 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.045018 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.185970 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-logs\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.186069 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-scripts\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.186113 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.186159 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.186189 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-config-data\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.186216 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pftsh\" (UniqueName: \"kubernetes.io/projected/cb47b6b2-760a-4899-84f6-fdf1bd62a418-kube-api-access-pftsh\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.186273 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.186362 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.287536 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-logs\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.287616 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-scripts\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.287681 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.287710 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.287730 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-config-data\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.287746 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pftsh\" (UniqueName: \"kubernetes.io/projected/cb47b6b2-760a-4899-84f6-fdf1bd62a418-kube-api-access-pftsh\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.287788 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.287846 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.289379 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-logs\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.291601 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME 
capability not set. Skipping MountDevice... Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.291637 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5d3045414bd1cd74ec61e0394ba262493610c57a87bbc940ef275e8fc1bc2ecf/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.292728 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.296446 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-scripts\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.299140 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.306214 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.306739 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-config-data\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.307307 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pftsh\" (UniqueName: \"kubernetes.io/projected/cb47b6b2-760a-4899-84f6-fdf1bd62a418-kube-api-access-pftsh\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.341337 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.356346 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.948367 4781 generic.go:334] "Generic (PLEG): container finished" podID="314ca901-3264-4136-b377-daad0075b72c" containerID="89638f7647330ea3c5230d3d253e70beeda178adf35863cd73f9bfed5a1f6c4c" exitCode=0 Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.948405 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bf4zw" event={"ID":"314ca901-3264-4136-b377-daad0075b72c","Type":"ContainerDied","Data":"89638f7647330ea3c5230d3d253e70beeda178adf35863cd73f9bfed5a1f6c4c"} Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.333803 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" path="/var/lib/kubelet/pods/fe8b7774-c640-416d-82a4-535fee88a47b/volumes" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.419918 4781 scope.go:117] "RemoveContainer" containerID="6b3c74fbdf3287fb50b684f9cb5119d529b541e265cc750ba76a4e2cbc2b36ce" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.636959 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.642870 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jqsnp" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719052 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-combined-ca-bundle\") pod \"a3fa4251-dd48-417b-8002-6df02d3d3dac\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719154 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-config-data\") pod \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719182 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-credential-keys\") pod \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719221 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-scripts\") pod \"a3fa4251-dd48-417b-8002-6df02d3d3dac\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719237 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-scripts\") pod \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719368 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh2sk\" (UniqueName: 
\"kubernetes.io/projected/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-kube-api-access-jh2sk\") pod \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719410 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-config-data\") pod \"a3fa4251-dd48-417b-8002-6df02d3d3dac\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719442 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwdbp\" (UniqueName: \"kubernetes.io/projected/a3fa4251-dd48-417b-8002-6df02d3d3dac-kube-api-access-pwdbp\") pod \"a3fa4251-dd48-417b-8002-6df02d3d3dac\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719472 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3fa4251-dd48-417b-8002-6df02d3d3dac-logs\") pod \"a3fa4251-dd48-417b-8002-6df02d3d3dac\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719500 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-combined-ca-bundle\") pod \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719538 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-fernet-keys\") pod \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.720518 
4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3fa4251-dd48-417b-8002-6df02d3d3dac-logs" (OuterVolumeSpecName: "logs") pod "a3fa4251-dd48-417b-8002-6df02d3d3dac" (UID: "a3fa4251-dd48-417b-8002-6df02d3d3dac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.728143 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3fa4251-dd48-417b-8002-6df02d3d3dac-kube-api-access-pwdbp" (OuterVolumeSpecName: "kube-api-access-pwdbp") pod "a3fa4251-dd48-417b-8002-6df02d3d3dac" (UID: "a3fa4251-dd48-417b-8002-6df02d3d3dac"). InnerVolumeSpecName "kube-api-access-pwdbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.730811 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-scripts" (OuterVolumeSpecName: "scripts") pod "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" (UID: "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.733019 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" (UID: "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.733712 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-scripts" (OuterVolumeSpecName: "scripts") pod "a3fa4251-dd48-417b-8002-6df02d3d3dac" (UID: "a3fa4251-dd48-417b-8002-6df02d3d3dac"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.734598 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-kube-api-access-jh2sk" (OuterVolumeSpecName: "kube-api-access-jh2sk") pod "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" (UID: "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1"). InnerVolumeSpecName "kube-api-access-jh2sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.736873 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" (UID: "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.755916 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-config-data" (OuterVolumeSpecName: "config-data") pod "a3fa4251-dd48-417b-8002-6df02d3d3dac" (UID: "a3fa4251-dd48-417b-8002-6df02d3d3dac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.756755 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" (UID: "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.757905 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3fa4251-dd48-417b-8002-6df02d3d3dac" (UID: "a3fa4251-dd48-417b-8002-6df02d3d3dac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.759186 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-config-data" (OuterVolumeSpecName: "config-data") pod "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" (UID: "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821788 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821845 4781 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821869 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821890 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 
00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821907 4781 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821922 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821936 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821950 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh2sk\" (UniqueName: \"kubernetes.io/projected/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-kube-api-access-jh2sk\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821969 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821986 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwdbp\" (UniqueName: \"kubernetes.io/projected/a3fa4251-dd48-417b-8002-6df02d3d3dac-kube-api-access-pwdbp\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.822000 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3fa4251-dd48-417b-8002-6df02d3d3dac-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.963865 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jqsnp" 
event={"ID":"a3fa4251-dd48-417b-8002-6df02d3d3dac","Type":"ContainerDied","Data":"6da65166fa2a15c764d849696d3e6b0686802ef8180c50248f4b03677850887a"} Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.965530 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6da65166fa2a15c764d849696d3e6b0686802ef8180c50248f4b03677850887a" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.964201 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jqsnp" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.968574 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d","Type":"ContainerStarted","Data":"b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717"} Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.978774 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.980035 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gxj6b" event={"ID":"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1","Type":"ContainerDied","Data":"cd1310454f14cbb6fa301146043553cf2eabbe6f919a1570a19e8768d9fd1b5d"} Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.980080 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd1310454f14cbb6fa301146043553cf2eabbe6f919a1570a19e8768d9fd1b5d" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.031396 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:27:32 crc kubenswrapper[4781]: W0227 00:27:32.074948 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb47b6b2_760a_4899_84f6_fdf1bd62a418.slice/crio-866ec8dc8dd6eea2cbe5498cddcd7820b1b7a00e1ecb5ebf3196c3d57588106d WatchSource:0}: Error finding container 866ec8dc8dd6eea2cbe5498cddcd7820b1b7a00e1ecb5ebf3196c3d57588106d: Status 404 returned error can't find the container with id 866ec8dc8dd6eea2cbe5498cddcd7820b1b7a00e1ecb5ebf3196c3d57588106d Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.139493 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-76c479bbf8-lkpd7"] Feb 27 00:27:32 crc kubenswrapper[4781]: E0227 00:27:32.140014 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3fa4251-dd48-417b-8002-6df02d3d3dac" containerName="placement-db-sync" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.140036 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3fa4251-dd48-417b-8002-6df02d3d3dac" containerName="placement-db-sync" Feb 27 00:27:32 crc kubenswrapper[4781]: E0227 00:27:32.140068 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" containerName="keystone-bootstrap" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.140078 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" containerName="keystone-bootstrap" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.140421 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" containerName="keystone-bootstrap" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.140472 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3fa4251-dd48-417b-8002-6df02d3d3dac" containerName="placement-db-sync" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.142000 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.150268 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.150610 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.150844 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.151052 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7kxfw" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.152144 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.153968 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76c479bbf8-lkpd7"] Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.232147 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33c297e1-af3e-46d6-9738-8e6833deaf02-logs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.232196 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-config-data\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.232248 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-combined-ca-bundle\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.232432 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnnv4\" (UniqueName: \"kubernetes.io/projected/33c297e1-af3e-46d6-9738-8e6833deaf02-kube-api-access-jnnv4\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.232512 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-public-tls-certs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.232617 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-internal-tls-certs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.232723 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-scripts\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.335550 
4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-scripts\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.335704 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33c297e1-af3e-46d6-9738-8e6833deaf02-logs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.335732 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-config-data\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.335771 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-combined-ca-bundle\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.335819 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnnv4\" (UniqueName: \"kubernetes.io/projected/33c297e1-af3e-46d6-9738-8e6833deaf02-kube-api-access-jnnv4\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.335844 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-public-tls-certs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.335881 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-internal-tls-certs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.339761 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33c297e1-af3e-46d6-9738-8e6833deaf02-logs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.342812 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-scripts\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.344243 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-internal-tls-certs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.344304 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-config-data\") pod \"placement-76c479bbf8-lkpd7\" (UID: 
\"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.344303 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-public-tls-certs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.344911 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-combined-ca-bundle\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.345653 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.354470 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnnv4\" (UniqueName: \"kubernetes.io/projected/33c297e1-af3e-46d6-9738-8e6833deaf02-kube-api-access-jnnv4\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.437583 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmp94\" (UniqueName: \"kubernetes.io/projected/314ca901-3264-4136-b377-daad0075b72c-kube-api-access-lmp94\") pod \"314ca901-3264-4136-b377-daad0075b72c\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.437884 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-db-sync-config-data\") pod \"314ca901-3264-4136-b377-daad0075b72c\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.437950 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-combined-ca-bundle\") pod \"314ca901-3264-4136-b377-daad0075b72c\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.442725 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/314ca901-3264-4136-b377-daad0075b72c-kube-api-access-lmp94" (OuterVolumeSpecName: "kube-api-access-lmp94") pod "314ca901-3264-4136-b377-daad0075b72c" (UID: "314ca901-3264-4136-b377-daad0075b72c"). InnerVolumeSpecName "kube-api-access-lmp94". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.444691 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "314ca901-3264-4136-b377-daad0075b72c" (UID: "314ca901-3264-4136-b377-daad0075b72c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.460296 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.476802 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "314ca901-3264-4136-b377-daad0075b72c" (UID: "314ca901-3264-4136-b377-daad0075b72c"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.540822 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmp94\" (UniqueName: \"kubernetes.io/projected/314ca901-3264-4136-b377-daad0075b72c-kube-api-access-lmp94\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.540850 4781 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.540859 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.887300 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-56459cf68c-4q7c8"] Feb 27 00:27:32 crc kubenswrapper[4781]: E0227 00:27:32.888253 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="314ca901-3264-4136-b377-daad0075b72c" containerName="barbican-db-sync" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.888293 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="314ca901-3264-4136-b377-daad0075b72c" containerName="barbican-db-sync" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.888602 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="314ca901-3264-4136-b377-daad0075b72c" containerName="barbican-db-sync" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.889987 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.896028 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.896278 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.897025 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nhgp" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.897181 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.897372 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.897608 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56459cf68c-4q7c8"] Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.901051 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.951340 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-public-tls-certs\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.951386 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-fernet-keys\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " 
pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.951442 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-combined-ca-bundle\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.951479 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx5b7\" (UniqueName: \"kubernetes.io/projected/2467458a-476f-460f-a6ce-144d7304476d-kube-api-access-sx5b7\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.951506 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-scripts\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.951525 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-config-data\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.951539 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-internal-tls-certs\") pod \"keystone-56459cf68c-4q7c8\" (UID: 
\"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.951580 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-credential-keys\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.976205 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76c479bbf8-lkpd7"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.007229 4781 generic.go:334] "Generic (PLEG): container finished" podID="aef65495-ecb2-4396-bb05-a4c5ee48f291" containerID="7d9a07674537261cb97d86282370b22b357712af922b31aea2a8cfe67e8a0a4c" exitCode=0 Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.007322 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9vlp4" event={"ID":"aef65495-ecb2-4396-bb05-a4c5ee48f291","Type":"ContainerDied","Data":"7d9a07674537261cb97d86282370b22b357712af922b31aea2a8cfe67e8a0a4c"} Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.011883 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb47b6b2-760a-4899-84f6-fdf1bd62a418","Type":"ContainerStarted","Data":"5483ad8c7ab58752a9371dfb8baad38f002e9c4bc521ec62a86d28db8755aca8"} Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.011933 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb47b6b2-760a-4899-84f6-fdf1bd62a418","Type":"ContainerStarted","Data":"866ec8dc8dd6eea2cbe5498cddcd7820b1b7a00e1ecb5ebf3196c3d57588106d"} Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.014846 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bf4zw" 
event={"ID":"314ca901-3264-4136-b377-daad0075b72c","Type":"ContainerDied","Data":"0dfa44d37d2f64ae96d38dcaa27616ed0a623f908ad45d0066876fbf98be36ee"} Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.014864 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dfa44d37d2f64ae96d38dcaa27616ed0a623f908ad45d0066876fbf98be36ee" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.014892 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.053321 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-public-tls-certs\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.053365 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-fernet-keys\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.053419 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-combined-ca-bundle\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.053457 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx5b7\" (UniqueName: \"kubernetes.io/projected/2467458a-476f-460f-a6ce-144d7304476d-kube-api-access-sx5b7\") pod 
\"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.053503 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-scripts\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.053525 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-config-data\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.053540 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-internal-tls-certs\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.053576 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-credential-keys\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.057258 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-credential-keys\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " 
pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.057848 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-scripts\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.057885 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-fernet-keys\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.058656 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-config-data\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.059453 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-public-tls-certs\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.059985 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-internal-tls-certs\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.061983 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-combined-ca-bundle\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.066811 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.075354 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx5b7\" (UniqueName: \"kubernetes.io/projected/2467458a-476f-460f-a6ce-144d7304476d-kube-api-access-sx5b7\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.140415 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5d6jk"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.141008 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" podUID="555d083f-48ec-4cf2-922f-211c99af51be" containerName="dnsmasq-dns" containerID="cri-o://4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2" gracePeriod=10 Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.236434 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.430708 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6fff4854c8-ttzsm"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.433581 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6fff4854c8-ttzsm"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.433608 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7dd7c6f4ff-m4d2l"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.434343 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.438698 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7dd7c6f4ff-m4d2l"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.438779 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.440287 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2j295" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.440431 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.440551 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.444851 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m576l"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.446505 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.452750 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.456638 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m576l"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572054 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj629\" (UniqueName: \"kubernetes.io/projected/f92df023-2e4a-495e-bbef-4a043c661f46-kube-api-access-qj629\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572404 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-combined-ca-bundle\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572437 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-config-data-custom\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572478 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-config\") pod \"dnsmasq-dns-85ff748b95-m576l\" 
(UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572498 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572516 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-config-data\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572575 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-svc\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572600 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-config-data-custom\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572648 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/41039943-96a7-4fe6-8b66-0d64cd12a1fa-logs\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572667 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572693 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572737 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-config-data\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572769 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tldtl\" (UniqueName: \"kubernetes.io/projected/41039943-96a7-4fe6-8b66-0d64cd12a1fa-kube-api-access-tldtl\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572843 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92df023-2e4a-495e-bbef-4a043c661f46-logs\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572893 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twcwm\" (UniqueName: \"kubernetes.io/projected/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-kube-api-access-twcwm\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572972 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-combined-ca-bundle\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.588807 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5fbbfd856b-vgvjg"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.591136 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.593857 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.601285 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fbbfd856b-vgvjg"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.675873 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-svc\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676207 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-config-data-custom\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676238 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676254 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41039943-96a7-4fe6-8b66-0d64cd12a1fa-logs\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 
00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676279 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-config-data\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676295 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676323 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tldtl\" (UniqueName: \"kubernetes.io/projected/41039943-96a7-4fe6-8b66-0d64cd12a1fa-kube-api-access-tldtl\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676397 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92df023-2e4a-495e-bbef-4a043c661f46-logs\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676424 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twcwm\" (UniqueName: \"kubernetes.io/projected/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-kube-api-access-twcwm\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " 
pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676457 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-combined-ca-bundle\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676474 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj629\" (UniqueName: \"kubernetes.io/projected/f92df023-2e4a-495e-bbef-4a043c661f46-kube-api-access-qj629\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676831 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-combined-ca-bundle\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676877 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-svc\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676898 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-config-data-custom\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") 
" pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676916 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-config\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676938 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.677006 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-config-data\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.680491 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92df023-2e4a-495e-bbef-4a043c661f46-logs\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.680841 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc 
kubenswrapper[4781]: I0227 00:27:33.681468 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.681804 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41039943-96a7-4fe6-8b66-0d64cd12a1fa-logs\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.682436 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-config\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.683208 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.687387 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-combined-ca-bundle\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.688193 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-config-data\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.689188 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-config-data\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.689277 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-combined-ca-bundle\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.690771 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-config-data-custom\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.695840 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-config-data-custom\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.697688 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tldtl\" (UniqueName: \"kubernetes.io/projected/41039943-96a7-4fe6-8b66-0d64cd12a1fa-kube-api-access-tldtl\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.704216 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj629\" (UniqueName: \"kubernetes.io/projected/f92df023-2e4a-495e-bbef-4a043c661f46-kube-api-access-qj629\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.704393 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twcwm\" (UniqueName: \"kubernetes.io/projected/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-kube-api-access-twcwm\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.779206 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data-custom\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.779277 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49f24c54-4f24-4f97-a01a-04640bf67b0f-logs\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc 
kubenswrapper[4781]: I0227 00:27:33.779404 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-combined-ca-bundle\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.779431 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d79w\" (UniqueName: \"kubernetes.io/projected/49f24c54-4f24-4f97-a01a-04640bf67b0f-kube-api-access-5d79w\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.779511 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.812600 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.853092 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.860717 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.892314 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.892433 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data-custom\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.892489 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49f24c54-4f24-4f97-a01a-04640bf67b0f-logs\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.892763 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-combined-ca-bundle\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.892821 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d79w\" (UniqueName: \"kubernetes.io/projected/49f24c54-4f24-4f97-a01a-04640bf67b0f-kube-api-access-5d79w\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " 
pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.894976 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49f24c54-4f24-4f97-a01a-04640bf67b0f-logs\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.898228 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.903048 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data-custom\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.920027 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-combined-ca-bundle\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.924280 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d79w\" (UniqueName: \"kubernetes.io/projected/49f24c54-4f24-4f97-a01a-04640bf67b0f-kube-api-access-5d79w\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: 
I0227 00:27:33.927190 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.971959 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.010486 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-swift-storage-0\") pod \"555d083f-48ec-4cf2-922f-211c99af51be\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.010573 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-nb\") pod \"555d083f-48ec-4cf2-922f-211c99af51be\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.010656 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-svc\") pod \"555d083f-48ec-4cf2-922f-211c99af51be\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.010734 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-config\") pod \"555d083f-48ec-4cf2-922f-211c99af51be\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.010843 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxl4f\" (UniqueName: 
\"kubernetes.io/projected/555d083f-48ec-4cf2-922f-211c99af51be-kube-api-access-hxl4f\") pod \"555d083f-48ec-4cf2-922f-211c99af51be\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.011038 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-sb\") pod \"555d083f-48ec-4cf2-922f-211c99af51be\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.037079 4781 generic.go:334] "Generic (PLEG): container finished" podID="85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f" containerID="69c8105b8323ade72457bd14497f65a1085bea33148e01ee2cbcfb2de3687cdf" exitCode=0 Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.037174 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f","Type":"ContainerDied","Data":"69c8105b8323ade72457bd14497f65a1085bea33148e01ee2cbcfb2de3687cdf"} Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.057436 4781 generic.go:334] "Generic (PLEG): container finished" podID="555d083f-48ec-4cf2-922f-211c99af51be" containerID="4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2" exitCode=0 Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.057619 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.057819 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" event={"ID":"555d083f-48ec-4cf2-922f-211c99af51be","Type":"ContainerDied","Data":"4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2"} Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.057897 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" event={"ID":"555d083f-48ec-4cf2-922f-211c99af51be","Type":"ContainerDied","Data":"e585d85b515ebdb2e3ddbc1e6c665f7d63c4b4ae71b72a576899c9774906e6b5"} Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.057916 4781 scope.go:117] "RemoveContainer" containerID="4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.079018 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/555d083f-48ec-4cf2-922f-211c99af51be-kube-api-access-hxl4f" (OuterVolumeSpecName: "kube-api-access-hxl4f") pod "555d083f-48ec-4cf2-922f-211c99af51be" (UID: "555d083f-48ec-4cf2-922f-211c99af51be"). InnerVolumeSpecName "kube-api-access-hxl4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.125388 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxl4f\" (UniqueName: \"kubernetes.io/projected/555d083f-48ec-4cf2-922f-211c99af51be-kube-api-access-hxl4f\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.144570 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb47b6b2-760a-4899-84f6-fdf1bd62a418","Type":"ContainerStarted","Data":"fd234a650b390b48c2c62ec04eb6c4e5afa5d6f4b0db395429958fa19cde51f2"} Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.156225 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c479bbf8-lkpd7" event={"ID":"33c297e1-af3e-46d6-9738-8e6833deaf02","Type":"ContainerStarted","Data":"412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8"} Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.156265 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c479bbf8-lkpd7" event={"ID":"33c297e1-af3e-46d6-9738-8e6833deaf02","Type":"ContainerStarted","Data":"21b15cb407945a01adc26829ab99f15cd9c656e66d81cf610b3118b8b9526261"} Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.164215 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56459cf68c-4q7c8"] Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.165834 4781 scope.go:117] "RemoveContainer" containerID="bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.204185 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.204164589 podStartE2EDuration="5.204164589s" podCreationTimestamp="2026-02-27 00:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:34.202671439 +0000 UTC m=+1323.460210993" watchObservedRunningTime="2026-02-27 00:27:34.204164589 +0000 UTC m=+1323.461704143" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.283957 4781 scope.go:117] "RemoveContainer" containerID="4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2" Feb 27 00:27:34 crc kubenswrapper[4781]: E0227 00:27:34.286586 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2\": container with ID starting with 4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2 not found: ID does not exist" containerID="4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.286695 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2"} err="failed to get container status \"4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2\": rpc error: code = NotFound desc = could not find container \"4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2\": container with ID starting with 4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2 not found: ID does not exist" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.286721 4781 scope.go:117] "RemoveContainer" containerID="bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15" Feb 27 00:27:34 crc kubenswrapper[4781]: E0227 00:27:34.287876 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15\": container with ID starting with bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15 not found: 
ID does not exist" containerID="bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15"
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.287909 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15"} err="failed to get container status \"bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15\": rpc error: code = NotFound desc = could not find container \"bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15\": container with ID starting with bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15 not found: ID does not exist"
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.445463 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "555d083f-48ec-4cf2-922f-211c99af51be" (UID: "555d083f-48ec-4cf2-922f-211c99af51be"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.494069 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "555d083f-48ec-4cf2-922f-211c99af51be" (UID: "555d083f-48ec-4cf2-922f-211c99af51be"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.494752 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-config" (OuterVolumeSpecName: "config") pod "555d083f-48ec-4cf2-922f-211c99af51be" (UID: "555d083f-48ec-4cf2-922f-211c99af51be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.509589 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "555d083f-48ec-4cf2-922f-211c99af51be" (UID: "555d083f-48ec-4cf2-922f-211c99af51be"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.509781 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "555d083f-48ec-4cf2-922f-211c99af51be" (UID: "555d083f-48ec-4cf2-922f-211c99af51be"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.536151 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.536452 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.536462 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.536471 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.536482 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.729702 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5d6jk"]
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.746602 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5d6jk"]
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.779001 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6fff4854c8-ttzsm"]
Feb 27 00:27:34 crc kubenswrapper[4781]: W0227 00:27:34.788117 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41039943_96a7_4fe6_8b66_0d64cd12a1fa.slice/crio-8435f854a42c84590af86fd91c77e61b452c0ad591c3cb3add91ab407c060fca WatchSource:0}: Error finding container 8435f854a42c84590af86fd91c77e61b452c0ad591c3cb3add91ab407c060fca: Status 404 returned error can't find the container with id 8435f854a42c84590af86fd91c77e61b452c0ad591c3cb3add91ab407c060fca
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.798007 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7dd7c6f4ff-m4d2l"]
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.890284 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9vlp4"
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.946980 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-combined-ca-bundle\") pod \"aef65495-ecb2-4396-bb05-a4c5ee48f291\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") "
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.947046 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-config-data\") pod \"aef65495-ecb2-4396-bb05-a4c5ee48f291\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") "
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.947069 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-scripts\") pod \"aef65495-ecb2-4396-bb05-a4c5ee48f291\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") "
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.947174 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aef65495-ecb2-4396-bb05-a4c5ee48f291-etc-machine-id\") pod \"aef65495-ecb2-4396-bb05-a4c5ee48f291\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") "
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.950049 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-db-sync-config-data\") pod \"aef65495-ecb2-4396-bb05-a4c5ee48f291\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") "
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.950148 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvkqc\" (UniqueName: \"kubernetes.io/projected/aef65495-ecb2-4396-bb05-a4c5ee48f291-kube-api-access-tvkqc\") pod \"aef65495-ecb2-4396-bb05-a4c5ee48f291\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") "
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.954219 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aef65495-ecb2-4396-bb05-a4c5ee48f291-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "aef65495-ecb2-4396-bb05-a4c5ee48f291" (UID: "aef65495-ecb2-4396-bb05-a4c5ee48f291"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.960501 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "aef65495-ecb2-4396-bb05-a4c5ee48f291" (UID: "aef65495-ecb2-4396-bb05-a4c5ee48f291"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.960713 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-scripts" (OuterVolumeSpecName: "scripts") pod "aef65495-ecb2-4396-bb05-a4c5ee48f291" (UID: "aef65495-ecb2-4396-bb05-a4c5ee48f291"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.969955 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aef65495-ecb2-4396-bb05-a4c5ee48f291-kube-api-access-tvkqc" (OuterVolumeSpecName: "kube-api-access-tvkqc") pod "aef65495-ecb2-4396-bb05-a4c5ee48f291" (UID: "aef65495-ecb2-4396-bb05-a4c5ee48f291"). InnerVolumeSpecName "kube-api-access-tvkqc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.037077 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aef65495-ecb2-4396-bb05-a4c5ee48f291" (UID: "aef65495-ecb2-4396-bb05-a4c5ee48f291"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.045938 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m576l"]
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.046263 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-config-data" (OuterVolumeSpecName: "config-data") pod "aef65495-ecb2-4396-bb05-a4c5ee48f291" (UID: "aef65495-ecb2-4396-bb05-a4c5ee48f291"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:35 crc kubenswrapper[4781]: W0227 00:27:35.051426 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49f24c54_4f24_4f97_a01a_04640bf67b0f.slice/crio-abf074d9baa2f3d6e8969094139a58da187066e40f9840d7df7ac1542a6fb7f6 WatchSource:0}: Error finding container abf074d9baa2f3d6e8969094139a58da187066e40f9840d7df7ac1542a6fb7f6: Status 404 returned error can't find the container with id abf074d9baa2f3d6e8969094139a58da187066e40f9840d7df7ac1542a6fb7f6
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.055607 4781 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.055662 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvkqc\" (UniqueName: \"kubernetes.io/projected/aef65495-ecb2-4396-bb05-a4c5ee48f291-kube-api-access-tvkqc\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.055677 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.055690 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.055700 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.055711 4781 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aef65495-ecb2-4396-bb05-a4c5ee48f291-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.056239 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fbbfd856b-vgvjg"]
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.176137 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m576l" event={"ID":"cda4fb4c-7510-49d2-b7bb-2a61c669bacd","Type":"ContainerStarted","Data":"754e7671d1990c27612d0957bd563a0b4f17011e98b48fda1600802520e76182"}
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.178437 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56459cf68c-4q7c8" event={"ID":"2467458a-476f-460f-a6ce-144d7304476d","Type":"ContainerStarted","Data":"ff6f7d298294bd4b03eba521b7541e4de877f502be33db785bbf690ed8409bf3"}
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.178483 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56459cf68c-4q7c8" event={"ID":"2467458a-476f-460f-a6ce-144d7304476d","Type":"ContainerStarted","Data":"024f4985a838887352d7012c1a10f2d744a1369f0eeee5b41465456991d59d79"}
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.178574 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-56459cf68c-4q7c8"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.183870 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" event={"ID":"f92df023-2e4a-495e-bbef-4a043c661f46","Type":"ContainerStarted","Data":"d556255b7fcd41dba0cdb3d082ddcb7ae02f6232b565d8f11f4719ce58109ffc"}
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.201072 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f","Type":"ContainerStarted","Data":"45b721ee40c522c5e8c9429e4acd1b60e74a7dda2c29b7520c9f497aec09c91f"}
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.220180 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-56459cf68c-4q7c8" podStartSLOduration=3.2201613 podStartE2EDuration="3.2201613s" podCreationTimestamp="2026-02-27 00:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:35.216432092 +0000 UTC m=+1324.473971656" watchObservedRunningTime="2026-02-27 00:27:35.2201613 +0000 UTC m=+1324.477700854"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.229409 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9vlp4" event={"ID":"aef65495-ecb2-4396-bb05-a4c5ee48f291","Type":"ContainerDied","Data":"77049757ad8c9d1e53f2546542f34ddf95b52b836b4034f26af7417bb129d6d8"}
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.229452 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77049757ad8c9d1e53f2546542f34ddf95b52b836b4034f26af7417bb129d6d8"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.229417 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9vlp4"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.239692 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbbfd856b-vgvjg" event={"ID":"49f24c54-4f24-4f97-a01a-04640bf67b0f","Type":"ContainerStarted","Data":"abf074d9baa2f3d6e8969094139a58da187066e40f9840d7df7ac1542a6fb7f6"}
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.247376 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" event={"ID":"41039943-96a7-4fe6-8b66-0d64cd12a1fa","Type":"ContainerStarted","Data":"8435f854a42c84590af86fd91c77e61b452c0ad591c3cb3add91ab407c060fca"}
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.252877 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c479bbf8-lkpd7" event={"ID":"33c297e1-af3e-46d6-9738-8e6833deaf02","Type":"ContainerStarted","Data":"6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68"}
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.252934 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76c479bbf8-lkpd7"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.252952 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76c479bbf8-lkpd7"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.262686 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 27 00:27:35 crc kubenswrapper[4781]: E0227 00:27:35.263678 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555d083f-48ec-4cf2-922f-211c99af51be" containerName="init"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.263703 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="555d083f-48ec-4cf2-922f-211c99af51be" containerName="init"
Feb 27 00:27:35 crc kubenswrapper[4781]: E0227 00:27:35.263736 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555d083f-48ec-4cf2-922f-211c99af51be" containerName="dnsmasq-dns"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.263746 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="555d083f-48ec-4cf2-922f-211c99af51be" containerName="dnsmasq-dns"
Feb 27 00:27:35 crc kubenswrapper[4781]: E0227 00:27:35.263761 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef65495-ecb2-4396-bb05-a4c5ee48f291" containerName="cinder-db-sync"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.263769 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef65495-ecb2-4396-bb05-a4c5ee48f291" containerName="cinder-db-sync"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.264024 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="aef65495-ecb2-4396-bb05-a4c5ee48f291" containerName="cinder-db-sync"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.264048 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="555d083f-48ec-4cf2-922f-211c99af51be" containerName="dnsmasq-dns"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.269879 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.278767 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.278883 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.278909 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5hsdr"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.279017 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.296110 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-76c479bbf8-lkpd7" podStartSLOduration=3.296094016 podStartE2EDuration="3.296094016s" podCreationTimestamp="2026-02-27 00:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:35.279601701 +0000 UTC m=+1324.537141265" watchObservedRunningTime="2026-02-27 00:27:35.296094016 +0000 UTC m=+1324.553633560"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.296516 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.352516 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="555d083f-48ec-4cf2-922f-211c99af51be" path="/var/lib/kubelet/pods/555d083f-48ec-4cf2-922f-211c99af51be/volumes"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.353670 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m576l"]
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.398809 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-scripts\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.398866 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.398900 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.399002 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.399025 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87l2c\" (UniqueName: \"kubernetes.io/projected/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-kube-api-access-87l2c\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.399125 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.429960 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-btbp6"]
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.438644 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.467229 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-btbp6"]
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501125 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501182 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501201 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-config\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501233 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w96d\" (UniqueName: \"kubernetes.io/projected/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-kube-api-access-4w96d\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501262 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-scripts\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501289 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501315 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501332 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501354 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501384 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501429 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501443 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87l2c\" (UniqueName: \"kubernetes.io/projected/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-kube-api-access-87l2c\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.504250 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.568557 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.571059 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.574925 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.606132 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.606873 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.606917 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.606956 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.607116 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.607136 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-config\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.607174 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w96d\" (UniqueName: \"kubernetes.io/projected/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-kube-api-access-4w96d\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.688588 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.688621 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.689108 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.689175 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-scripts\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.689582 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87l2c\" (UniqueName: \"kubernetes.io/projected/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-kube-api-access-87l2c\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.689948 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.690055 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.690357 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.690677 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-config\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.698034 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.710748 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-scripts\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.710831 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-logs\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.710861 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.710885 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data-custom\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.711064 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5fsl\" (UniqueName: \"kubernetes.io/projected/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-kube-api-access-g5fsl\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.711297 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.711342 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.711993 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w96d\" (UniqueName: \"kubernetes.io/projected/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-kube-api-access-4w96d\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.813452 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-logs\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.813508 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.813533 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data-custom\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.813674 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5fsl\" (UniqueName: \"kubernetes.io/projected/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-kube-api-access-g5fsl\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.813757 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.813782 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0"
Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.813835 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-scripts\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0"
Feb 27 00:27:35 crc
kubenswrapper[4781]: I0227 00:27:35.813955 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.833527 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-logs\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.838420 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.838508 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data-custom\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.838650 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-scripts\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.839037 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5fsl\" (UniqueName: \"kubernetes.io/projected/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-kube-api-access-g5fsl\") pod \"cinder-api-0\" (UID: 
\"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.844598 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.893085 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.927045 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.953506 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.294753 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbbfd856b-vgvjg" event={"ID":"49f24c54-4f24-4f97-a01a-04640bf67b0f","Type":"ContainerStarted","Data":"0f274fbd09031e3b8e38174b2cbe52a6c6f5f24b60283aea8a0e1a01875fd8b1"} Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.295007 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbbfd856b-vgvjg" event={"ID":"49f24c54-4f24-4f97-a01a-04640bf67b0f","Type":"ContainerStarted","Data":"ee82fae8a491ded998bd0190a2bb94c2ff762013e3316811c1d1f983f5c06787"} Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.298240 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.298339 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.303577 4781 
generic.go:334] "Generic (PLEG): container finished" podID="cda4fb4c-7510-49d2-b7bb-2a61c669bacd" containerID="855ac7a49dcfb27210a6b4627deec5ef2b8dada97c06c16142807b4ec54a5193" exitCode=0 Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.304656 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m576l" event={"ID":"cda4fb4c-7510-49d2-b7bb-2a61c669bacd","Type":"ContainerDied","Data":"855ac7a49dcfb27210a6b4627deec5ef2b8dada97c06c16142807b4ec54a5193"} Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.365175 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5fbbfd856b-vgvjg" podStartSLOduration=3.365151739 podStartE2EDuration="3.365151739s" podCreationTimestamp="2026-02-27 00:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:36.335240468 +0000 UTC m=+1325.592780022" watchObservedRunningTime="2026-02-27 00:27:36.365151739 +0000 UTC m=+1325.622691293" Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.533421 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-btbp6"] Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.643564 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.819445 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 00:27:38 crc kubenswrapper[4781]: W0227 00:27:38.968424 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9e8c990_7ee9_4f45_91cd_3b49bffbe639.slice/crio-046131775b2694d2a879762121c97dfa31aa0232cc2f2fd28fe9bc50e7ab867a WatchSource:0}: Error finding container 046131775b2694d2a879762121c97dfa31aa0232cc2f2fd28fe9bc50e7ab867a: Status 404 returned error can't find the 
container with id 046131775b2694d2a879762121c97dfa31aa0232cc2f2fd28fe9bc50e7ab867a Feb 27 00:27:39 crc kubenswrapper[4781]: W0227 00:27:39.005592 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4ba4e2f_1bdf_4f98_a4f6_16c12df07d27.slice/crio-7688a82f556cb3de2b0b7afd752b5b83e46b7cf5458a44e7fd021d53d5d1b140 WatchSource:0}: Error finding container 7688a82f556cb3de2b0b7afd752b5b83e46b7cf5458a44e7fd021d53d5d1b140: Status 404 returned error can't find the container with id 7688a82f556cb3de2b0b7afd752b5b83e46b7cf5458a44e7fd021d53d5d1b140 Feb 27 00:27:39 crc kubenswrapper[4781]: W0227 00:27:39.013617 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2a90e98_bb9f_436d_9a1c_8aebd91000e3.slice/crio-f5601a008ad9454c1a7af70c2d0c5712b2a38f8540f6108d4eb74d5c92b8bcd7 WatchSource:0}: Error finding container f5601a008ad9454c1a7af70c2d0c5712b2a38f8540f6108d4eb74d5c92b8bcd7: Status 404 returned error can't find the container with id f5601a008ad9454c1a7af70c2d0c5712b2a38f8540f6108d4eb74d5c92b8bcd7 Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.212061 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.286781 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-config\") pod \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.286868 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-swift-storage-0\") pod \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.286895 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-svc\") pod \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.287000 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twcwm\" (UniqueName: \"kubernetes.io/projected/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-kube-api-access-twcwm\") pod \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.287035 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-sb\") pod \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.287176 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-nb\") pod \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.302662 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-kube-api-access-twcwm" (OuterVolumeSpecName: "kube-api-access-twcwm") pod "cda4fb4c-7510-49d2-b7bb-2a61c669bacd" (UID: "cda4fb4c-7510-49d2-b7bb-2a61c669bacd"). InnerVolumeSpecName "kube-api-access-twcwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.366263 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" event={"ID":"f2a90e98-bb9f-436d-9a1c-8aebd91000e3","Type":"ContainerStarted","Data":"f5601a008ad9454c1a7af70c2d0c5712b2a38f8540f6108d4eb74d5c92b8bcd7"} Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.367814 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27","Type":"ContainerStarted","Data":"7688a82f556cb3de2b0b7afd752b5b83e46b7cf5458a44e7fd021d53d5d1b140"} Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.371004 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9e8c990-7ee9-4f45-91cd-3b49bffbe639","Type":"ContainerStarted","Data":"046131775b2694d2a879762121c97dfa31aa0232cc2f2fd28fe9bc50e7ab867a"} Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.378129 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m576l" event={"ID":"cda4fb4c-7510-49d2-b7bb-2a61c669bacd","Type":"ContainerDied","Data":"754e7671d1990c27612d0957bd563a0b4f17011e98b48fda1600802520e76182"} Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.378153 4781 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.378181 4781 scope.go:117] "RemoveContainer" containerID="855ac7a49dcfb27210a6b4627deec5ef2b8dada97c06c16142807b4ec54a5193" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.389749 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f","Type":"ContainerStarted","Data":"7c199541b9c842fbd78de05b7b58ee7fd9ba33f171300f536207f3f7cedd9d3e"} Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.391973 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twcwm\" (UniqueName: \"kubernetes.io/projected/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-kube-api-access-twcwm\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.466808 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cda4fb4c-7510-49d2-b7bb-2a61c669bacd" (UID: "cda4fb4c-7510-49d2-b7bb-2a61c669bacd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.473250 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cda4fb4c-7510-49d2-b7bb-2a61c669bacd" (UID: "cda4fb4c-7510-49d2-b7bb-2a61c669bacd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.478948 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-config" (OuterVolumeSpecName: "config") pod "cda4fb4c-7510-49d2-b7bb-2a61c669bacd" (UID: "cda4fb4c-7510-49d2-b7bb-2a61c669bacd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.485376 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cda4fb4c-7510-49d2-b7bb-2a61c669bacd" (UID: "cda4fb4c-7510-49d2-b7bb-2a61c669bacd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.494219 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.494250 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.494260 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.494292 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:39 
crc kubenswrapper[4781]: I0227 00:27:39.495144 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cda4fb4c-7510-49d2-b7bb-2a61c669bacd" (UID: "cda4fb4c-7510-49d2-b7bb-2a61c669bacd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.596422 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.726339 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.752249 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m576l"] Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.773307 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m576l"] Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.356952 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.357004 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.398442 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-9fcdb6594-94vkn"] Feb 27 00:27:40 crc kubenswrapper[4781]: E0227 00:27:40.398903 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda4fb4c-7510-49d2-b7bb-2a61c669bacd" containerName="init" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.398919 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cda4fb4c-7510-49d2-b7bb-2a61c669bacd" containerName="init" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.399102 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="cda4fb4c-7510-49d2-b7bb-2a61c669bacd" containerName="init" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.400240 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.404833 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.405039 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.426150 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" event={"ID":"41039943-96a7-4fe6-8b66-0d64cd12a1fa","Type":"ContainerStarted","Data":"4111b413c8b691f7a54045bb36452f81012bdf07d4cada4fdae2d7b3ddfe3237"} Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.426186 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" event={"ID":"41039943-96a7-4fe6-8b66-0d64cd12a1fa","Type":"ContainerStarted","Data":"af5803adc1a48a1ca69fb2089b5d654625ecfb4fd451473c8e9a3e463b8767d4"} Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.434427 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" event={"ID":"f92df023-2e4a-495e-bbef-4a043c661f46","Type":"ContainerStarted","Data":"b9d473355f3d57f0e6e5327867c207fe882474bb2e3fa96e04a968bddf05b4e9"} Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.434519 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" 
event={"ID":"f92df023-2e4a-495e-bbef-4a043c661f46","Type":"ContainerStarted","Data":"c375d6b34e7ff931edc33e6ab35ecf11dbe7b62ea5e8fd59cb9e0c69680c4757"} Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.440543 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9fcdb6594-94vkn"] Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.459555 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" podStartSLOduration=3.212790004 podStartE2EDuration="7.459533087s" podCreationTimestamp="2026-02-27 00:27:33 +0000 UTC" firstStartedPulling="2026-02-27 00:27:34.810711773 +0000 UTC m=+1324.068251327" lastFinishedPulling="2026-02-27 00:27:39.057454856 +0000 UTC m=+1328.314994410" observedRunningTime="2026-02-27 00:27:40.440821112 +0000 UTC m=+1329.698360666" watchObservedRunningTime="2026-02-27 00:27:40.459533087 +0000 UTC m=+1329.717072641" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.478525 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.484046 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f","Type":"ContainerStarted","Data":"b28843ddc0eeedac24aea963235ae5c3e5d9e83cd06600a666e30355e28fcc9b"} Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.488340 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" podStartSLOduration=3.24144135 podStartE2EDuration="7.488317447s" podCreationTimestamp="2026-02-27 00:27:33 +0000 UTC" firstStartedPulling="2026-02-27 00:27:34.81060948 +0000 UTC m=+1324.068149034" lastFinishedPulling="2026-02-27 00:27:39.057485577 +0000 UTC m=+1328.315025131" observedRunningTime="2026-02-27 00:27:40.465999748 +0000 UTC m=+1329.723539302" 
watchObservedRunningTime="2026-02-27 00:27:40.488317447 +0000 UTC m=+1329.745857001" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.494119 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.502582 4781 generic.go:334] "Generic (PLEG): container finished" podID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" containerID="5c6246746a3c78078a59adb64a2979be72d82f5cfd95c152a4db993cadaf1efe" exitCode=0 Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.502701 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" event={"ID":"f2a90e98-bb9f-436d-9a1c-8aebd91000e3","Type":"ContainerDied","Data":"5c6246746a3c78078a59adb64a2979be72d82f5cfd95c152a4db993cadaf1efe"} Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.507752 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27","Type":"ContainerStarted","Data":"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280"} Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.522298 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-public-tls-certs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.522353 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-internal-tls-certs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: 
I0227 00:27:40.522394 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/582fee51-d9df-4150-b217-889f2f4f8852-logs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.522441 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-combined-ca-bundle\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.522682 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7qcq\" (UniqueName: \"kubernetes.io/projected/582fee51-d9df-4150-b217-889f2f4f8852-kube-api-access-n7qcq\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.522724 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-config-data-custom\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.522819 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-config-data\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 
27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.545165 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-l9w6z" event={"ID":"2274af64-0743-4ede-8fb8-e2ed801638ac","Type":"ContainerStarted","Data":"6964fd56259850480217527d40244a043795966342292bb5a943a33534e5489f"} Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.546409 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.546506 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.563063 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=34.563047782 podStartE2EDuration="34.563047782s" podCreationTimestamp="2026-02-27 00:27:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:40.5569343 +0000 UTC m=+1329.814473874" watchObservedRunningTime="2026-02-27 00:27:40.563047782 +0000 UTC m=+1329.820587336" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.625386 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-l9w6z" podStartSLOduration=3.890576528 podStartE2EDuration="1m2.625367298s" podCreationTimestamp="2026-02-27 00:26:38 +0000 UTC" firstStartedPulling="2026-02-27 00:26:40.793529045 +0000 UTC m=+1270.051068599" lastFinishedPulling="2026-02-27 00:27:39.528319815 +0000 UTC m=+1328.785859369" observedRunningTime="2026-02-27 00:27:40.619886353 +0000 UTC m=+1329.877425907" watchObservedRunningTime="2026-02-27 00:27:40.625367298 +0000 UTC m=+1329.882906852" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.625961 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n7qcq\" (UniqueName: \"kubernetes.io/projected/582fee51-d9df-4150-b217-889f2f4f8852-kube-api-access-n7qcq\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.626021 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-config-data-custom\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.626089 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-config-data\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.626144 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-public-tls-certs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.626172 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-internal-tls-certs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.626226 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/582fee51-d9df-4150-b217-889f2f4f8852-logs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.626285 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-combined-ca-bundle\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.630073 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/582fee51-d9df-4150-b217-889f2f4f8852-logs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.634912 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-config-data\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.635414 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-internal-tls-certs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.636448 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-config-data-custom\") pod 
\"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.637221 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-public-tls-certs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.641511 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-combined-ca-bundle\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.649942 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7qcq\" (UniqueName: \"kubernetes.io/projected/582fee51-d9df-4150-b217-889f2f4f8852-kube-api-access-n7qcq\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.736191 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.337393 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cda4fb4c-7510-49d2-b7bb-2a61c669bacd" path="/var/lib/kubelet/pods/cda4fb4c-7510-49d2-b7bb-2a61c669bacd/volumes" Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.402057 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9fcdb6594-94vkn"] Feb 27 00:27:41 crc kubenswrapper[4781]: W0227 00:27:41.420798 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod582fee51_d9df_4150_b217_889f2f4f8852.slice/crio-12d9d33c6a9d179d79a7973dee43ecbdf0803cff1fa39e9e98b5a1c252b40784 WatchSource:0}: Error finding container 12d9d33c6a9d179d79a7973dee43ecbdf0803cff1fa39e9e98b5a1c252b40784: Status 404 returned error can't find the container with id 12d9d33c6a9d179d79a7973dee43ecbdf0803cff1fa39e9e98b5a1c252b40784 Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.593099 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9fcdb6594-94vkn" event={"ID":"582fee51-d9df-4150-b217-889f2f4f8852","Type":"ContainerStarted","Data":"12d9d33c6a9d179d79a7973dee43ecbdf0803cff1fa39e9e98b5a1c252b40784"} Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.596580 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27","Type":"ContainerStarted","Data":"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078"} Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.596756 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerName="cinder-api-log" containerID="cri-o://29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280" gracePeriod=30 Feb 27 
00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.597762 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.597845 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerName="cinder-api" containerID="cri-o://763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078" gracePeriod=30 Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.610546 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9e8c990-7ee9-4f45-91cd-3b49bffbe639","Type":"ContainerStarted","Data":"41d9cebd94820355907237ef05a02051691148a4c837c57f7086e2f710d9721d"} Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.614682 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" event={"ID":"f2a90e98-bb9f-436d-9a1c-8aebd91000e3","Type":"ContainerStarted","Data":"8eb943556508c5cc9103fa044300406224b9b4973d8e501d8f7538f1c3573e24"} Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.614722 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.641184 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.641165664 podStartE2EDuration="6.641165664s" podCreationTimestamp="2026-02-27 00:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:41.637962519 +0000 UTC m=+1330.895502083" watchObservedRunningTime="2026-02-27 00:27:41.641165664 +0000 UTC m=+1330.898705218" Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.665352 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" podStartSLOduration=6.665318962 podStartE2EDuration="6.665318962s" podCreationTimestamp="2026-02-27 00:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:41.659474977 +0000 UTC m=+1330.917014871" watchObservedRunningTime="2026-02-27 00:27:41.665318962 +0000 UTC m=+1330.922858516" Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.969991 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.628007 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.637874 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9fcdb6594-94vkn" event={"ID":"582fee51-d9df-4150-b217-889f2f4f8852","Type":"ContainerStarted","Data":"64905d7f8814a6f41685585d47354bca5f1dd631fafcd1e8f96fea6ccb13b368"} Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.637920 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9fcdb6594-94vkn" event={"ID":"582fee51-d9df-4150-b217-889f2f4f8852","Type":"ContainerStarted","Data":"7ae906019bba9d2b84dbd14808c0e069fccf6224d243961e67e2f805a0b64d72"} Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.638828 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.638859 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.656020 4781 generic.go:334] "Generic (PLEG): container finished" podID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" 
containerID="763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078" exitCode=0 Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.656050 4781 generic.go:334] "Generic (PLEG): container finished" podID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerID="29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280" exitCode=143 Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.656091 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27","Type":"ContainerDied","Data":"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078"} Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.656121 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27","Type":"ContainerDied","Data":"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280"} Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.656131 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27","Type":"ContainerDied","Data":"7688a82f556cb3de2b0b7afd752b5b83e46b7cf5458a44e7fd021d53d5d1b140"} Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.656145 4781 scope.go:117] "RemoveContainer" containerID="763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.656266 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.662816 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9e8c990-7ee9-4f45-91cd-3b49bffbe639","Type":"ContainerStarted","Data":"b553522f28c5c9228f18ef94c90af74eec79bd23cb9493d672e1a7c999be2dde"} Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.663349 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.663366 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.706806 4781 scope.go:117] "RemoveContainer" containerID="29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.708472 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-9fcdb6594-94vkn" podStartSLOduration=2.70844983 podStartE2EDuration="2.70844983s" podCreationTimestamp="2026-02-27 00:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:42.6955728 +0000 UTC m=+1331.953112354" watchObservedRunningTime="2026-02-27 00:27:42.70844983 +0000 UTC m=+1331.965989384" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.736050 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.794091965 podStartE2EDuration="7.736034079s" podCreationTimestamp="2026-02-27 00:27:35 +0000 UTC" firstStartedPulling="2026-02-27 00:27:38.998260073 +0000 UTC m=+1328.255799627" lastFinishedPulling="2026-02-27 00:27:39.940202197 +0000 UTC m=+1329.197741741" observedRunningTime="2026-02-27 00:27:42.722311077 +0000 UTC m=+1331.979850651" watchObservedRunningTime="2026-02-27 00:27:42.736034079 +0000 UTC 
m=+1331.993573633" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.745365 4781 scope.go:117] "RemoveContainer" containerID="763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078" Feb 27 00:27:42 crc kubenswrapper[4781]: E0227 00:27:42.748323 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078\": container with ID starting with 763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078 not found: ID does not exist" containerID="763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.748362 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078"} err="failed to get container status \"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078\": rpc error: code = NotFound desc = could not find container \"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078\": container with ID starting with 763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078 not found: ID does not exist" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.748384 4781 scope.go:117] "RemoveContainer" containerID="29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280" Feb 27 00:27:42 crc kubenswrapper[4781]: E0227 00:27:42.748766 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280\": container with ID starting with 29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280 not found: ID does not exist" containerID="29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.748788 4781 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280"} err="failed to get container status \"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280\": rpc error: code = NotFound desc = could not find container \"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280\": container with ID starting with 29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280 not found: ID does not exist" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.748802 4781 scope.go:117] "RemoveContainer" containerID="763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.748991 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078"} err="failed to get container status \"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078\": rpc error: code = NotFound desc = could not find container \"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078\": container with ID starting with 763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078 not found: ID does not exist" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.749005 4781 scope.go:117] "RemoveContainer" containerID="29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.749204 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280"} err="failed to get container status \"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280\": rpc error: code = NotFound desc = could not find container \"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280\": container with ID starting with 
29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280 not found: ID does not exist" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.786406 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5fsl\" (UniqueName: \"kubernetes.io/projected/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-kube-api-access-g5fsl\") pod \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.786540 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-combined-ca-bundle\") pod \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.786563 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data-custom\") pod \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.786694 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data\") pod \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.786773 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-logs\") pod \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.786851 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-etc-machine-id\") pod \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.786885 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-scripts\") pod \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.792755 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-kube-api-access-g5fsl" (OuterVolumeSpecName: "kube-api-access-g5fsl") pod "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" (UID: "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27"). InnerVolumeSpecName "kube-api-access-g5fsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.793085 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" (UID: "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.793833 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-logs" (OuterVolumeSpecName: "logs") pod "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" (UID: "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.794943 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-scripts" (OuterVolumeSpecName: "scripts") pod "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" (UID: "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.795740 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" (UID: "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.819936 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" (UID: "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.845710 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data" (OuterVolumeSpecName: "config-data") pod "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" (UID: "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.889494 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.889530 4781 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.889543 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.889551 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5fsl\" (UniqueName: \"kubernetes.io/projected/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-kube-api-access-g5fsl\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.889561 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.889570 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.889578 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.895652 4781 patch_prober.go:28] 
interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.895696 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.895738 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.896250 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"40924ce0e5e04646329cd01d3e3dc65fdaf6b21bdd01704d3fa5ed81c86443f6"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.896312 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://40924ce0e5e04646329cd01d3e3dc65fdaf6b21bdd01704d3fa5ed81c86443f6" gracePeriod=600 Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.002106 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.019772 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 27 
00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.028061 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 27 00:27:43 crc kubenswrapper[4781]: E0227 00:27:43.028558 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerName="cinder-api" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.028578 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerName="cinder-api" Feb 27 00:27:43 crc kubenswrapper[4781]: E0227 00:27:43.028590 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerName="cinder-api-log" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.028597 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerName="cinder-api-log" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.031867 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerName="cinder-api-log" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.031921 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerName="cinder-api" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.034078 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.036499 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.037585 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.037783 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.037900 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.167408 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.197595 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.197673 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cb0bf7e-097c-4c30-b0e6-224090588da2-logs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.197698 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1cb0bf7e-097c-4c30-b0e6-224090588da2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 
27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.197802 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-scripts\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.197843 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcxsc\" (UniqueName: \"kubernetes.io/projected/1cb0bf7e-097c-4c30-b0e6-224090588da2-kube-api-access-pcxsc\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.197898 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-config-data\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.197920 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.197995 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.198040 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-config-data-custom\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.243881 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.264420 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.304725 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcxsc\" (UniqueName: \"kubernetes.io/projected/1cb0bf7e-097c-4c30-b0e6-224090588da2-kube-api-access-pcxsc\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.304792 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-config-data\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.304821 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.304860 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.304898 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-config-data-custom\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.305011 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.305074 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cb0bf7e-097c-4c30-b0e6-224090588da2-logs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.305100 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1cb0bf7e-097c-4c30-b0e6-224090588da2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.305147 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-scripts\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.311606 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1cb0bf7e-097c-4c30-b0e6-224090588da2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.312426 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.317210 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.340776 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cb0bf7e-097c-4c30-b0e6-224090588da2-logs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.347040 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-scripts\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.348519 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-config-data-custom\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.360026 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" path="/var/lib/kubelet/pods/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27/volumes"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.362969 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.396778 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-config-data\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.419249 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcxsc\" (UniqueName: \"kubernetes.io/projected/1cb0bf7e-097c-4c30-b0e6-224090588da2-kube-api-access-pcxsc\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.510426 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b48494fc7-447pr"]
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.510693 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b48494fc7-447pr" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-api" containerID="cri-o://994246fa04a777c2f0ceb85d5b3e476072c41f89030472fc48f602b083a3eada" gracePeriod=30
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.511374 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b48494fc7-447pr" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-httpd" containerID="cri-o://d7c09d305d22e97d0875bde304e390f511aac9300a440daba221eab217d0ec4d" gracePeriod=30
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.527844 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56f5d76fc7-rbhdd"]
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.544433 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.549723 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5b48494fc7-447pr" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.176:9696/\": EOF"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.577102 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56f5d76fc7-rbhdd"]
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.648543 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-config\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.648625 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-internal-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.648689 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfrzq\" (UniqueName: \"kubernetes.io/projected/384db6f0-71f1-4926-9e65-5c27eb430325-kube-api-access-pfrzq\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.648721 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-public-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.648743 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-httpd-config\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.648955 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-ovndb-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.649015 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-combined-ca-bundle\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.658672 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.690379 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="40924ce0e5e04646329cd01d3e3dc65fdaf6b21bdd01704d3fa5ed81c86443f6" exitCode=0
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.691107 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"40924ce0e5e04646329cd01d3e3dc65fdaf6b21bdd01704d3fa5ed81c86443f6"}
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.691175 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"18f81d6f38ae3802e83160171263bed0ca095345d87ab2807429711c0c761818"}
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.691193 4781 scope.go:117] "RemoveContainer" containerID="58cd249b96a5284dbe453e012e30bb3f9acbc9ed9b891c6e44075d418edc5ad9"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.750897 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-config\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.750981 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-internal-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.751019 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfrzq\" (UniqueName: \"kubernetes.io/projected/384db6f0-71f1-4926-9e65-5c27eb430325-kube-api-access-pfrzq\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.751054 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-public-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.751073 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-httpd-config\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.751118 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-ovndb-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.751141 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-combined-ca-bundle\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.754971 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-internal-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.758363 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-ovndb-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.758421 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-config\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.759952 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-httpd-config\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.767678 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfrzq\" (UniqueName: \"kubernetes.io/projected/384db6f0-71f1-4926-9e65-5c27eb430325-kube-api-access-pfrzq\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.770038 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-public-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.770315 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-combined-ca-bundle\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.863409 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:44 crc kubenswrapper[4781]: I0227 00:27:44.227195 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 27 00:27:44 crc kubenswrapper[4781]: I0227 00:27:44.661613 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56f5d76fc7-rbhdd"]
Feb 27 00:27:44 crc kubenswrapper[4781]: I0227 00:27:44.717023 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1cb0bf7e-097c-4c30-b0e6-224090588da2","Type":"ContainerStarted","Data":"a3c8473f7c00f6f8d6ad0b5909fd3122ed20edb94776a38bb334741f178ba1b2"}
Feb 27 00:27:44 crc kubenswrapper[4781]: I0227 00:27:44.721340 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56f5d76fc7-rbhdd" event={"ID":"384db6f0-71f1-4926-9e65-5c27eb430325","Type":"ContainerStarted","Data":"c72f5df6adad889cfddb9f3c11e18af222ff251cf27f930b227141e5f1669d89"}
Feb 27 00:27:44 crc kubenswrapper[4781]: I0227 00:27:44.737235 4781 generic.go:334] "Generic (PLEG): container finished" podID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerID="d7c09d305d22e97d0875bde304e390f511aac9300a440daba221eab217d0ec4d" exitCode=0
Feb 27 00:27:44 crc kubenswrapper[4781]: I0227 00:27:44.737302 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b48494fc7-447pr" event={"ID":"2138a247-a569-4ed6-91a9-5dde2a0b5fa9","Type":"ContainerDied","Data":"d7c09d305d22e97d0875bde304e390f511aac9300a440daba221eab217d0ec4d"}
Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.029227 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5b48494fc7-447pr" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.176:9696/\": dial tcp 10.217.0.176:9696: connect: connection refused"
Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.503858 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fbbfd856b-vgvjg"
Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.770620 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1cb0bf7e-097c-4c30-b0e6-224090588da2","Type":"ContainerStarted","Data":"c2f402a28d29e4c07e4caf87ce15c5ce23b22ea1ee6fcef0fd84ea91e0276827"}
Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.772567 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56f5d76fc7-rbhdd" event={"ID":"384db6f0-71f1-4926-9e65-5c27eb430325","Type":"ContainerStarted","Data":"e18093c34186b89284f05d20d70c61677202602e6d87793470b7ec47e6c4f2d1"}
Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.772594 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56f5d76fc7-rbhdd" event={"ID":"384db6f0-71f1-4926-9e65-5c27eb430325","Type":"ContainerStarted","Data":"e0bdefffb66c07dedeef73f1b0f37060ecf0ce4e0424b4701aba3fd0711d0981"}
Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.773992 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.780728 4781 generic.go:334] "Generic (PLEG): container finished" podID="2274af64-0743-4ede-8fb8-e2ed801638ac" containerID="6964fd56259850480217527d40244a043795966342292bb5a943a33534e5489f" exitCode=0
Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.780811 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-l9w6z" event={"ID":"2274af64-0743-4ede-8fb8-e2ed801638ac","Type":"ContainerDied","Data":"6964fd56259850480217527d40244a043795966342292bb5a943a33534e5489f"}
Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.802362 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56f5d76fc7-rbhdd" podStartSLOduration=2.8023068650000003 podStartE2EDuration="2.802306865s" podCreationTimestamp="2026-02-27 00:27:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:45.79606681 +0000 UTC m=+1335.053606364" watchObservedRunningTime="2026-02-27 00:27:45.802306865 +0000 UTC m=+1335.059846419"
Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.954425 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 27 00:27:46 crc kubenswrapper[4781]: I0227 00:27:46.055499 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fbbfd856b-vgvjg"
Feb 27 00:27:46 crc kubenswrapper[4781]: I0227 00:27:46.271227 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 27 00:27:46 crc kubenswrapper[4781]: I0227 00:27:46.792163 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1cb0bf7e-097c-4c30-b0e6-224090588da2","Type":"ContainerStarted","Data":"186b7702c1547bb9b47df8a7f0efdae4d5d5c863e56c694361ce54ad078e236b"}
Feb 27 00:27:46 crc kubenswrapper[4781]: I0227 00:27:46.859572 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 27 00:27:46 crc kubenswrapper[4781]: I0227 00:27:46.863240 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.863220442 podStartE2EDuration="4.863220442s" podCreationTimestamp="2026-02-27 00:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:46.829551443 +0000 UTC m=+1336.087091027" watchObservedRunningTime="2026-02-27 00:27:46.863220442 +0000 UTC m=+1336.120759996"
Feb 27 00:27:47 crc kubenswrapper[4781]: I0227 00:27:47.802845 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 27 00:27:47 crc kubenswrapper[4781]: I0227 00:27:47.803222 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerName="cinder-scheduler" containerID="cri-o://41d9cebd94820355907237ef05a02051691148a4c837c57f7086e2f710d9721d" gracePeriod=30
Feb 27 00:27:47 crc kubenswrapper[4781]: I0227 00:27:47.803647 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerName="probe" containerID="cri-o://b553522f28c5c9228f18ef94c90af74eec79bd23cb9493d672e1a7c999be2dde" gracePeriod=30
Feb 27 00:27:48 crc kubenswrapper[4781]: I0227 00:27:48.812541 4781 generic.go:334] "Generic (PLEG): container finished" podID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerID="994246fa04a777c2f0ceb85d5b3e476072c41f89030472fc48f602b083a3eada" exitCode=0
Feb 27 00:27:48 crc kubenswrapper[4781]: I0227 00:27:48.812609 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b48494fc7-447pr" event={"ID":"2138a247-a569-4ed6-91a9-5dde2a0b5fa9","Type":"ContainerDied","Data":"994246fa04a777c2f0ceb85d5b3e476072c41f89030472fc48f602b083a3eada"}
Feb 27 00:27:48 crc kubenswrapper[4781]: I0227 00:27:48.814589 4781 generic.go:334] "Generic (PLEG): container finished" podID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerID="b553522f28c5c9228f18ef94c90af74eec79bd23cb9493d672e1a7c999be2dde" exitCode=0
Feb 27 00:27:48 crc kubenswrapper[4781]: I0227 00:27:48.814611 4781 generic.go:334] "Generic (PLEG): container finished" podID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerID="41d9cebd94820355907237ef05a02051691148a4c837c57f7086e2f710d9721d" exitCode=0
Feb 27 00:27:48 crc kubenswrapper[4781]: I0227 00:27:48.815527 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9e8c990-7ee9-4f45-91cd-3b49bffbe639","Type":"ContainerDied","Data":"b553522f28c5c9228f18ef94c90af74eec79bd23cb9493d672e1a7c999be2dde"}
Feb 27 00:27:48 crc kubenswrapper[4781]: I0227 00:27:48.815553 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9e8c990-7ee9-4f45-91cd-3b49bffbe639","Type":"ContainerDied","Data":"41d9cebd94820355907237ef05a02051691148a4c837c57f7086e2f710d9721d"}
Feb 27 00:27:49 crc kubenswrapper[4781]: I0227 00:27:49.830620 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-l9w6z" event={"ID":"2274af64-0743-4ede-8fb8-e2ed801638ac","Type":"ContainerDied","Data":"6d62d5f9e32bc3adf9e5c830b2c7fb23773647380ed0a769526c60e85872b03f"}
Feb 27 00:27:49 crc kubenswrapper[4781]: I0227 00:27:49.830890 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d62d5f9e32bc3adf9e5c830b2c7fb23773647380ed0a769526c60e85872b03f"
Feb 27 00:27:49 crc kubenswrapper[4781]: I0227 00:27:49.915690 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-l9w6z"
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.000411 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-config-data\") pod \"2274af64-0743-4ede-8fb8-e2ed801638ac\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.000552 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwsv7\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-kube-api-access-bwsv7\") pod \"2274af64-0743-4ede-8fb8-e2ed801638ac\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.000598 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-scripts\") pod \"2274af64-0743-4ede-8fb8-e2ed801638ac\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.000664 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-combined-ca-bundle\") pod \"2274af64-0743-4ede-8fb8-e2ed801638ac\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.000744 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-certs\") pod \"2274af64-0743-4ede-8fb8-e2ed801638ac\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.015865 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-scripts" (OuterVolumeSpecName: "scripts") pod "2274af64-0743-4ede-8fb8-e2ed801638ac" (UID: "2274af64-0743-4ede-8fb8-e2ed801638ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.018779 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-certs" (OuterVolumeSpecName: "certs") pod "2274af64-0743-4ede-8fb8-e2ed801638ac" (UID: "2274af64-0743-4ede-8fb8-e2ed801638ac"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.023169 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-kube-api-access-bwsv7" (OuterVolumeSpecName: "kube-api-access-bwsv7") pod "2274af64-0743-4ede-8fb8-e2ed801638ac" (UID: "2274af64-0743-4ede-8fb8-e2ed801638ac"). InnerVolumeSpecName "kube-api-access-bwsv7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.047966 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-config-data" (OuterVolumeSpecName: "config-data") pod "2274af64-0743-4ede-8fb8-e2ed801638ac" (UID: "2274af64-0743-4ede-8fb8-e2ed801638ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.059735 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2274af64-0743-4ede-8fb8-e2ed801638ac" (UID: "2274af64-0743-4ede-8fb8-e2ed801638ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.105585 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwsv7\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-kube-api-access-bwsv7\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.105648 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.105659 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.105677 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-certs\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.105686 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.572399 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b48494fc7-447pr"
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.579825 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.722431 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-public-tls-certs\") pod \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.722764 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-scripts\") pod \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.722815 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data-custom\") pod \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.722849 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4zg7\" (UniqueName: \"kubernetes.io/projected/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-kube-api-access-h4zg7\") pod \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.722874 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-internal-tls-certs\") pod \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.722922 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-httpd-config\") pod \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.722981 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data\") pod \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.723095 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-combined-ca-bundle\") pod \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.723614 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-ovndb-tls-certs\") pod \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.723663 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87l2c\" (UniqueName: \"kubernetes.io/projected/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-kube-api-access-87l2c\") pod \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.723691 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-combined-ca-bundle\") pod \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.723733 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-config\") pod \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.723802 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-etc-machine-id\") pod \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") "
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.724708 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c9e8c990-7ee9-4f45-91cd-3b49bffbe639" (UID: "c9e8c990-7ee9-4f45-91cd-3b49bffbe639"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.726362 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-kube-api-access-h4zg7" (OuterVolumeSpecName: "kube-api-access-h4zg7") pod "2138a247-a569-4ed6-91a9-5dde2a0b5fa9" (UID: "2138a247-a569-4ed6-91a9-5dde2a0b5fa9"). InnerVolumeSpecName "kube-api-access-h4zg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.735210 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-scripts" (OuterVolumeSpecName: "scripts") pod "c9e8c990-7ee9-4f45-91cd-3b49bffbe639" (UID: "c9e8c990-7ee9-4f45-91cd-3b49bffbe639"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.735877 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c9e8c990-7ee9-4f45-91cd-3b49bffbe639" (UID: "c9e8c990-7ee9-4f45-91cd-3b49bffbe639"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.741786 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-kube-api-access-87l2c" (OuterVolumeSpecName: "kube-api-access-87l2c") pod "c9e8c990-7ee9-4f45-91cd-3b49bffbe639" (UID: "c9e8c990-7ee9-4f45-91cd-3b49bffbe639"). InnerVolumeSpecName "kube-api-access-87l2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.744760 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2138a247-a569-4ed6-91a9-5dde2a0b5fa9" (UID: "2138a247-a569-4ed6-91a9-5dde2a0b5fa9"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.826414 4781 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.826446 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.826456 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.826465 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4zg7\" (UniqueName: \"kubernetes.io/projected/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-kube-api-access-h4zg7\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.826476 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.826485 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87l2c\" (UniqueName: \"kubernetes.io/projected/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-kube-api-access-87l2c\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.843062 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2138a247-a569-4ed6-91a9-5dde2a0b5fa9" (UID: 
"2138a247-a569-4ed6-91a9-5dde2a0b5fa9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.858374 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.858432 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b48494fc7-447pr" event={"ID":"2138a247-a569-4ed6-91a9-5dde2a0b5fa9","Type":"ContainerDied","Data":"8cfc8b26590e03ab4b9d1a7221cd85bef307e38eb533c1221abe3eafc0089adc"} Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.858572 4781 scope.go:117] "RemoveContainer" containerID="d7c09d305d22e97d0875bde304e390f511aac9300a440daba221eab217d0ec4d" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.862268 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.862516 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.862822 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9e8c990-7ee9-4f45-91cd-3b49bffbe639","Type":"ContainerDied","Data":"046131775b2694d2a879762121c97dfa31aa0232cc2f2fd28fe9bc50e7ab867a"} Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.881955 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9e8c990-7ee9-4f45-91cd-3b49bffbe639" (UID: "c9e8c990-7ee9-4f45-91cd-3b49bffbe639"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.895765 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.908168 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2138a247-a569-4ed6-91a9-5dde2a0b5fa9" (UID: "2138a247-a569-4ed6-91a9-5dde2a0b5fa9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.916017 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2138a247-a569-4ed6-91a9-5dde2a0b5fa9" (UID: "2138a247-a569-4ed6-91a9-5dde2a0b5fa9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.929413 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.929465 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.929476 4781 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.929484 4781 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.966873 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-config" (OuterVolumeSpecName: "config") pod "2138a247-a569-4ed6-91a9-5dde2a0b5fa9" (UID: "2138a247-a569-4ed6-91a9-5dde2a0b5fa9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.977682 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bp4v9"] Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.977919 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" podUID="dd15e642-6664-416f-ac4e-9cddc96e5642" containerName="dnsmasq-dns" containerID="cri-o://17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf" gracePeriod=10 Feb 27 00:27:51 crc kubenswrapper[4781]: E0227 00:27:51.003884 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.041537 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.075757 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2138a247-a569-4ed6-91a9-5dde2a0b5fa9" (UID: "2138a247-a569-4ed6-91a9-5dde2a0b5fa9"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.127720 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-g672n"] Feb 27 00:27:51 crc kubenswrapper[4781]: E0227 00:27:51.128114 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerName="probe" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128131 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerName="probe" Feb 27 00:27:51 crc kubenswrapper[4781]: E0227 00:27:51.128141 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-api" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128146 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-api" Feb 27 00:27:51 crc kubenswrapper[4781]: E0227 00:27:51.128163 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerName="cinder-scheduler" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128170 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerName="cinder-scheduler" Feb 27 00:27:51 crc kubenswrapper[4781]: E0227 00:27:51.128190 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2274af64-0743-4ede-8fb8-e2ed801638ac" containerName="cloudkitty-db-sync" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128197 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2274af64-0743-4ede-8fb8-e2ed801638ac" containerName="cloudkitty-db-sync" Feb 27 00:27:51 crc kubenswrapper[4781]: E0227 00:27:51.128211 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-httpd" Feb 27 
00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128217 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-httpd" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128395 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2274af64-0743-4ede-8fb8-e2ed801638ac" containerName="cloudkitty-db-sync" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128408 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-api" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128418 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-httpd" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128428 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerName="probe" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128448 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerName="cinder-scheduler" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.134351 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.141055 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.141276 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.141440 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-qt68h" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.141643 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.141759 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.143083 4781 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.151654 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data" (OuterVolumeSpecName: "config-data") pod "c9e8c990-7ee9-4f45-91cd-3b49bffbe639" (UID: "c9e8c990-7ee9-4f45-91cd-3b49bffbe639"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.174697 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-g672n"] Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.256111 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-scripts\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.256274 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbrfx\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-kube-api-access-zbrfx\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.256316 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-combined-ca-bundle\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.256334 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-certs\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.256463 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-config-data\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.256607 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.287937 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.306331 4781 scope.go:117] "RemoveContainer" containerID="994246fa04a777c2f0ceb85d5b3e476072c41f89030472fc48f602b083a3eada" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.308696 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.340859 4781 scope.go:117] "RemoveContainer" containerID="b553522f28c5c9228f18ef94c90af74eec79bd23cb9493d672e1a7c999be2dde" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.340863 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" path="/var/lib/kubelet/pods/c9e8c990-7ee9-4f45-91cd-3b49bffbe639/volumes" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.341591 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b48494fc7-447pr"] Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.347759 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5b48494fc7-447pr"] Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.357759 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.359966 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.361200 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbrfx\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-kube-api-access-zbrfx\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.361265 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-combined-ca-bundle\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.361286 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-certs\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.362065 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.362606 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-config-data\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.362811 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-scripts\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.363785 4781 scope.go:117] "RemoveContainer" containerID="41d9cebd94820355907237ef05a02051691148a4c837c57f7086e2f710d9721d" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.365833 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-combined-ca-bundle\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.366620 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-certs\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.367393 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-scripts\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.367443 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.368985 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-config-data\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " 
pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.384687 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbrfx\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-kube-api-access-zbrfx\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.464397 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16cb4c6c-2ddb-41e0-8db3-f44961445474-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.464445 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.464463 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-scripts\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.464529 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-config-data\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:51 crc 
kubenswrapper[4781]: I0227 00:27:51.465447 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.465473 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9plz\" (UniqueName: \"kubernetes.io/projected/16cb4c6c-2ddb-41e0-8db3-f44961445474-kube-api-access-d9plz\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: E0227 00:27:51.490615 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2274af64_0743_4ede_8fb8_e2ed801638ac.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2138a247_a569_4ed6_91a9_5dde2a0b5fa9.slice/crio-8cfc8b26590e03ab4b9d1a7221cd85bef307e38eb533c1221abe3eafc0089adc\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd15e642_6664_416f_ac4e_9cddc96e5642.slice/crio-17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2138a247_a569_4ed6_91a9_5dde2a0b5fa9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd15e642_6664_416f_ac4e_9cddc96e5642.slice/crio-conmon-17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2274af64_0743_4ede_8fb8_e2ed801638ac.slice/crio-6d62d5f9e32bc3adf9e5c830b2c7fb23773647380ed0a769526c60e85872b03f\": RecentStats: unable to find data in memory cache]"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.568993 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16cb4c6c-2ddb-41e0-8db3-f44961445474-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.569359 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.569394 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-scripts\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.569491 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-config-data\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.569096 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16cb4c6c-2ddb-41e0-8db3-f44961445474-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.569615 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.569687 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9plz\" (UniqueName: \"kubernetes.io/projected/16cb4c6c-2ddb-41e0-8db3-f44961445474-kube-api-access-d9plz\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.573293 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-scripts\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.573781 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.574175 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-config-data\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.574492 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.577917 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.587794 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9plz\" (UniqueName: \"kubernetes.io/projected/16cb4c6c-2ddb-41e0-8db3-f44961445474-kube-api-access-d9plz\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.589871 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.671552 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxql5\" (UniqueName: \"kubernetes.io/projected/dd15e642-6664-416f-ac4e-9cddc96e5642-kube-api-access-rxql5\") pod \"dd15e642-6664-416f-ac4e-9cddc96e5642\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") "
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.671742 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-sb\") pod \"dd15e642-6664-416f-ac4e-9cddc96e5642\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") "
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.671777 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-config\") pod \"dd15e642-6664-416f-ac4e-9cddc96e5642\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") "
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.671838 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-svc\") pod \"dd15e642-6664-416f-ac4e-9cddc96e5642\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") "
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.671993 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-swift-storage-0\") pod \"dd15e642-6664-416f-ac4e-9cddc96e5642\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") "
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.672084 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-nb\") pod \"dd15e642-6664-416f-ac4e-9cddc96e5642\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") "
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.676650 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd15e642-6664-416f-ac4e-9cddc96e5642-kube-api-access-rxql5" (OuterVolumeSpecName: "kube-api-access-rxql5") pod "dd15e642-6664-416f-ac4e-9cddc96e5642" (UID: "dd15e642-6664-416f-ac4e-9cddc96e5642"). InnerVolumeSpecName "kube-api-access-rxql5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.678146 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.729264 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd15e642-6664-416f-ac4e-9cddc96e5642" (UID: "dd15e642-6664-416f-ac4e-9cddc96e5642"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.753125 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dd15e642-6664-416f-ac4e-9cddc96e5642" (UID: "dd15e642-6664-416f-ac4e-9cddc96e5642"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.756669 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-config" (OuterVolumeSpecName: "config") pod "dd15e642-6664-416f-ac4e-9cddc96e5642" (UID: "dd15e642-6664-416f-ac4e-9cddc96e5642"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.771894 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dd15e642-6664-416f-ac4e-9cddc96e5642" (UID: "dd15e642-6664-416f-ac4e-9cddc96e5642"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.774197 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxql5\" (UniqueName: \"kubernetes.io/projected/dd15e642-6664-416f-ac4e-9cddc96e5642-kube-api-access-rxql5\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.774225 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.774234 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.774242 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.774250 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.787326 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dd15e642-6664-416f-ac4e-9cddc96e5642" (UID: "dd15e642-6664-416f-ac4e-9cddc96e5642"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.876406 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.890944 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d","Type":"ContainerStarted","Data":"e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8"}
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.891309 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="ceilometer-notification-agent" containerID="cri-o://a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892" gracePeriod=30
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.891457 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.891480 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="proxy-httpd" containerID="cri-o://e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8" gracePeriod=30
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.891542 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="sg-core" containerID="cri-o://b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717" gracePeriod=30
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.916928 4781 generic.go:334] "Generic (PLEG): container finished" podID="dd15e642-6664-416f-ac4e-9cddc96e5642" containerID="17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf" exitCode=0
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.916991 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" event={"ID":"dd15e642-6664-416f-ac4e-9cddc96e5642","Type":"ContainerDied","Data":"17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf"}
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.917019 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" event={"ID":"dd15e642-6664-416f-ac4e-9cddc96e5642","Type":"ContainerDied","Data":"9e67430f08589dcb2cfca360edd38ce35b2b7fe28eecbb76ca402ae3e309ab2c"}
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.917035 4781 scope.go:117] "RemoveContainer" containerID="17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.917146 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.965937 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bp4v9"]
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.970334 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.973455 4781 scope.go:117] "RemoveContainer" containerID="90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.977138 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bp4v9"]
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.979468 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.997431 4781 scope.go:117] "RemoveContainer" containerID="17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf"
Feb 27 00:27:52 crc kubenswrapper[4781]: E0227 00:27:52.017792 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf\": container with ID starting with 17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf not found: ID does not exist" containerID="17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf"
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.017836 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf"} err="failed to get container status \"17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf\": rpc error: code = NotFound desc = could not find container \"17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf\": container with ID starting with 17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf not found: ID does not exist"
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.017863 4781 scope.go:117] "RemoveContainer" containerID="90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5"
Feb 27 00:27:52 crc kubenswrapper[4781]: E0227 00:27:52.025178 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5\": container with ID starting with 90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5 not found: ID does not exist" containerID="90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5"
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.026330 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5"} err="failed to get container status \"90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5\": rpc error: code = NotFound desc = could not find container \"90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5\": container with ID starting with 90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5 not found: ID does not exist"
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.165458 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-g672n"]
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.326906 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 27 00:27:52 crc kubenswrapper[4781]: W0227 00:27:52.329091 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16cb4c6c_2ddb_41e0_8db3_f44961445474.slice/crio-b23fcde744d6f232272b068172acc365fc9f6d2c6112861ce3acf5dc8a4f39a5 WatchSource:0}: Error finding container b23fcde744d6f232272b068172acc365fc9f6d2c6112861ce3acf5dc8a4f39a5: Status 404 returned error can't find the container with id b23fcde744d6f232272b068172acc365fc9f6d2c6112861ce3acf5dc8a4f39a5
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.972363 4781 generic.go:334] "Generic (PLEG): container finished" podID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerID="e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8" exitCode=0
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.972733 4781 generic.go:334] "Generic (PLEG): container finished" podID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerID="b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717" exitCode=2
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.972730 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d","Type":"ContainerDied","Data":"e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8"}
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.972794 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d","Type":"ContainerDied","Data":"b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717"}
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.976196 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"16cb4c6c-2ddb-41e0-8db3-f44961445474","Type":"ContainerStarted","Data":"4e0ede3a6498fb18b0850e460c17347bbbbbad2912a10979839ed6832112689b"}
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.976248 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"16cb4c6c-2ddb-41e0-8db3-f44961445474","Type":"ContainerStarted","Data":"b23fcde744d6f232272b068172acc365fc9f6d2c6112861ce3acf5dc8a4f39a5"}
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.977774 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-g672n" event={"ID":"87b3198c-30ab-415a-b24b-b26ab3da838e","Type":"ContainerStarted","Data":"7aaaa3159dfec72ce2bfd72718ace0516b0de685b4c75d813a19d16d4226019b"}
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.977805 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-g672n" event={"ID":"87b3198c-30ab-415a-b24b-b26ab3da838e","Type":"ContainerStarted","Data":"5ea5e68fe7fb3730a14c055ae47e47a12d2ed4ea16d87ceb507c87aa6e875602"}
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.988954 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9fcdb6594-94vkn"
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.994993 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.036416 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-g672n" podStartSLOduration=2.03639958 podStartE2EDuration="2.03639958s" podCreationTimestamp="2026-02-27 00:27:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:53.000014509 +0000 UTC m=+1342.257554083" watchObservedRunningTime="2026-02-27 00:27:53.03639958 +0000 UTC m=+1342.293939134"
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.156275 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9fcdb6594-94vkn"
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.240810 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5fbbfd856b-vgvjg"]
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.241009 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5fbbfd856b-vgvjg" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api-log" containerID="cri-o://ee82fae8a491ded998bd0190a2bb94c2ff762013e3316811c1d1f983f5c06787" gracePeriod=30
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.242354 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5fbbfd856b-vgvjg" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api" containerID="cri-o://0f274fbd09031e3b8e38174b2cbe52a6c6f5f24b60283aea8a0e1a01875fd8b1" gracePeriod=30
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.470543 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" path="/var/lib/kubelet/pods/2138a247-a569-4ed6-91a9-5dde2a0b5fa9/volumes"
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.471478 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd15e642-6664-416f-ac4e-9cddc96e5642" path="/var/lib/kubelet/pods/dd15e642-6664-416f-ac4e-9cddc96e5642/volumes"
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.790284 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.894366 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-scripts\") pod \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") "
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.894767 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-run-httpd\") pod \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") "
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.894827 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-combined-ca-bundle\") pod \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") "
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.894890 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qwfg\" (UniqueName: \"kubernetes.io/projected/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-kube-api-access-9qwfg\") pod \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") "
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.894926 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-sg-core-conf-yaml\") pod \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") "
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.894947 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-config-data\") pod \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") "
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.894962 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-log-httpd\") pod \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") "
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.895069 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" (UID: "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.895417 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.895972 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" (UID: "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.921904 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-scripts" (OuterVolumeSpecName: "scripts") pod "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" (UID: "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.938790 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-kube-api-access-9qwfg" (OuterVolumeSpecName: "kube-api-access-9qwfg") pod "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" (UID: "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d"). InnerVolumeSpecName "kube-api-access-9qwfg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.981722 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" (UID: "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.001989 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.002025 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qwfg\" (UniqueName: \"kubernetes.io/projected/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-kube-api-access-9qwfg\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.002035 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.002043 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.041422 4781 generic.go:334] "Generic (PLEG): container finished" podID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerID="a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892" exitCode=0
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.041487 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d","Type":"ContainerDied","Data":"a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892"}
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.041514 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d","Type":"ContainerDied","Data":"8523e5974cb6fe577a148d4d77627c86ea1298c44ff6fdd8db602516c249b5d9"}
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.041533 4781 scope.go:117] "RemoveContainer" containerID="e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8"
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.041672 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.050416 4781 generic.go:334] "Generic (PLEG): container finished" podID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerID="ee82fae8a491ded998bd0190a2bb94c2ff762013e3316811c1d1f983f5c06787" exitCode=143
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.050741 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbbfd856b-vgvjg" event={"ID":"49f24c54-4f24-4f97-a01a-04640bf67b0f","Type":"ContainerDied","Data":"ee82fae8a491ded998bd0190a2bb94c2ff762013e3316811c1d1f983f5c06787"}
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.129257 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" (UID: "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.201712 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-config-data" (OuterVolumeSpecName: "config-data") pod "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" (UID: "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.214819 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.214851 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.236172 4781 scope.go:117] "RemoveContainer" containerID="b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717"
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.265870 4781 scope.go:117] "RemoveContainer" containerID="a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892"
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.297537 4781 scope.go:117] "RemoveContainer" containerID="e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8"
Feb 27 00:27:54 crc kubenswrapper[4781]: E0227 00:27:54.297979 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8\": container with ID starting with e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8 not found: ID does not exist" containerID="e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8"
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.298010 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8"} err="failed to get container status \"e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8\": rpc error: code = NotFound desc = could not find container \"e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8\": container with ID starting with e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8 not found: ID does not exist"
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.298034 4781 scope.go:117] "RemoveContainer" containerID="b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717"
Feb 27 00:27:54 crc kubenswrapper[4781]: E0227 00:27:54.298497 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717\": container with ID starting with b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717 not found: ID does not exist" containerID="b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717"
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.298521 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717"} err="failed to get container status \"b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717\": rpc error: code = NotFound desc = could not find container \"b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717\": container with ID starting with b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717 not found: ID does not exist"
Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.298536 4781 scope.go:117] "RemoveContainer" containerID="a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892"
Feb 27 00:27:54 crc kubenswrapper[4781]: E0227 00:27:54.298877 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892\": container with ID starting with a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892 not found: ID does not exist"
containerID="a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.298898 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892"} err="failed to get container status \"a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892\": rpc error: code = NotFound desc = could not find container \"a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892\": container with ID starting with a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892 not found: ID does not exist" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.393752 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.414258 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.427349 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:27:54 crc kubenswrapper[4781]: E0227 00:27:54.427782 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="proxy-httpd" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.427800 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="proxy-httpd" Feb 27 00:27:54 crc kubenswrapper[4781]: E0227 00:27:54.427810 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd15e642-6664-416f-ac4e-9cddc96e5642" containerName="dnsmasq-dns" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.427817 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd15e642-6664-416f-ac4e-9cddc96e5642" containerName="dnsmasq-dns" Feb 27 00:27:54 crc kubenswrapper[4781]: E0227 00:27:54.427852 4781 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="dd15e642-6664-416f-ac4e-9cddc96e5642" containerName="init" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.427859 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd15e642-6664-416f-ac4e-9cddc96e5642" containerName="init" Feb 27 00:27:54 crc kubenswrapper[4781]: E0227 00:27:54.427871 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="ceilometer-notification-agent" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.427878 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="ceilometer-notification-agent" Feb 27 00:27:54 crc kubenswrapper[4781]: E0227 00:27:54.427891 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="sg-core" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.427897 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="sg-core" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.428076 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="sg-core" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.428109 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="ceilometer-notification-agent" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.428126 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="proxy-httpd" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.428139 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd15e642-6664-416f-ac4e-9cddc96e5642" containerName="dnsmasq-dns" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.430175 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.433123 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.433466 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.445719 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.537864 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-scripts\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.537912 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.537978 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-log-httpd\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.538305 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-config-data\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " 
pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.538377 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lrpr\" (UniqueName: \"kubernetes.io/projected/a732412d-8655-4df0-90ba-1bf854b6d8d1-kube-api-access-2lrpr\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.538420 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-run-httpd\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.538462 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.639918 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-config-data\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.640567 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lrpr\" (UniqueName: \"kubernetes.io/projected/a732412d-8655-4df0-90ba-1bf854b6d8d1-kube-api-access-2lrpr\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.640609 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-run-httpd\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.640663 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.640708 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-scripts\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.640739 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.640877 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-log-httpd\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.641290 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-log-httpd\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " 
pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.641358 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-run-httpd\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.644669 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-config-data\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.645577 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-scripts\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.646188 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.648215 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.660823 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lrpr\" (UniqueName: 
\"kubernetes.io/projected/a732412d-8655-4df0-90ba-1bf854b6d8d1-kube-api-access-2lrpr\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.753817 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:27:55 crc kubenswrapper[4781]: I0227 00:27:55.060272 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"16cb4c6c-2ddb-41e0-8db3-f44961445474","Type":"ContainerStarted","Data":"1e9ffcbbd25742c01039bd3a97ac6ddd0b05895a54aef1dbf572bec3de71584f"} Feb 27 00:27:55 crc kubenswrapper[4781]: I0227 00:27:55.082522 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.082478184 podStartE2EDuration="4.082478184s" podCreationTimestamp="2026-02-27 00:27:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:55.078824518 +0000 UTC m=+1344.336364072" watchObservedRunningTime="2026-02-27 00:27:55.082478184 +0000 UTC m=+1344.340017728" Feb 27 00:27:55 crc kubenswrapper[4781]: I0227 00:27:55.322806 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" path="/var/lib/kubelet/pods/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d/volumes" Feb 27 00:27:55 crc kubenswrapper[4781]: I0227 00:27:55.348059 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:27:55 crc kubenswrapper[4781]: W0227 00:27:55.349920 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda732412d_8655_4df0_90ba_1bf854b6d8d1.slice/crio-10c0f1e24689995e21992a81df3156a3ac869c2c63cffc5db5d95aae3523ee7b WatchSource:0}: Error finding container 
10c0f1e24689995e21992a81df3156a3ac869c2c63cffc5db5d95aae3523ee7b: Status 404 returned error can't find the container with id 10c0f1e24689995e21992a81df3156a3ac869c2c63cffc5db5d95aae3523ee7b Feb 27 00:27:55 crc kubenswrapper[4781]: I0227 00:27:55.988731 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 27 00:27:56 crc kubenswrapper[4781]: I0227 00:27:56.079510 4781 generic.go:334] "Generic (PLEG): container finished" podID="87b3198c-30ab-415a-b24b-b26ab3da838e" containerID="7aaaa3159dfec72ce2bfd72718ace0516b0de685b4c75d813a19d16d4226019b" exitCode=0 Feb 27 00:27:56 crc kubenswrapper[4781]: I0227 00:27:56.079595 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-g672n" event={"ID":"87b3198c-30ab-415a-b24b-b26ab3da838e","Type":"ContainerDied","Data":"7aaaa3159dfec72ce2bfd72718ace0516b0de685b4c75d813a19d16d4226019b"} Feb 27 00:27:56 crc kubenswrapper[4781]: I0227 00:27:56.082861 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerStarted","Data":"10c0f1e24689995e21992a81df3156a3ac869c2c63cffc5db5d95aae3523ee7b"} Feb 27 00:27:56 crc kubenswrapper[4781]: I0227 00:27:56.679671 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 27 00:27:56 crc kubenswrapper[4781]: I0227 00:27:56.989053 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5fbbfd856b-vgvjg" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.183:9311/healthcheck\": read tcp 10.217.0.2:52142->10.217.0.183:9311: read: connection reset by peer" Feb 27 00:27:56 crc kubenswrapper[4781]: I0227 00:27:56.989389 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5fbbfd856b-vgvjg" 
podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.183:9311/healthcheck\": read tcp 10.217.0.2:52134->10.217.0.183:9311: read: connection reset by peer" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.101283 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerStarted","Data":"14d3fcba4ac0c08489e958e9281bb38dcd169375967f24d085bf95a4995989d3"} Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.101332 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerStarted","Data":"57f8cc16c7b772f63445194c5db7782e3fcd2bdad4a28c47f2161d0b1572b6c9"} Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.104649 4781 generic.go:334] "Generic (PLEG): container finished" podID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerID="0f274fbd09031e3b8e38174b2cbe52a6c6f5f24b60283aea8a0e1a01875fd8b1" exitCode=0 Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.104998 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbbfd856b-vgvjg" event={"ID":"49f24c54-4f24-4f97-a01a-04640bf67b0f","Type":"ContainerDied","Data":"0f274fbd09031e3b8e38174b2cbe52a6c6f5f24b60283aea8a0e1a01875fd8b1"} Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.396914 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.497935 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-combined-ca-bundle\") pod \"49f24c54-4f24-4f97-a01a-04640bf67b0f\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.498060 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d79w\" (UniqueName: \"kubernetes.io/projected/49f24c54-4f24-4f97-a01a-04640bf67b0f-kube-api-access-5d79w\") pod \"49f24c54-4f24-4f97-a01a-04640bf67b0f\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.498179 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49f24c54-4f24-4f97-a01a-04640bf67b0f-logs\") pod \"49f24c54-4f24-4f97-a01a-04640bf67b0f\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.498206 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data\") pod \"49f24c54-4f24-4f97-a01a-04640bf67b0f\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.498268 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data-custom\") pod \"49f24c54-4f24-4f97-a01a-04640bf67b0f\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.500174 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/49f24c54-4f24-4f97-a01a-04640bf67b0f-logs" (OuterVolumeSpecName: "logs") pod "49f24c54-4f24-4f97-a01a-04640bf67b0f" (UID: "49f24c54-4f24-4f97-a01a-04640bf67b0f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.516860 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "49f24c54-4f24-4f97-a01a-04640bf67b0f" (UID: "49f24c54-4f24-4f97-a01a-04640bf67b0f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.529469 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f24c54-4f24-4f97-a01a-04640bf67b0f-kube-api-access-5d79w" (OuterVolumeSpecName: "kube-api-access-5d79w") pod "49f24c54-4f24-4f97-a01a-04640bf67b0f" (UID: "49f24c54-4f24-4f97-a01a-04640bf67b0f"). InnerVolumeSpecName "kube-api-access-5d79w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.532162 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49f24c54-4f24-4f97-a01a-04640bf67b0f" (UID: "49f24c54-4f24-4f97-a01a-04640bf67b0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.575353 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data" (OuterVolumeSpecName: "config-data") pod "49f24c54-4f24-4f97-a01a-04640bf67b0f" (UID: "49f24c54-4f24-4f97-a01a-04640bf67b0f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.600493 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49f24c54-4f24-4f97-a01a-04640bf67b0f-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.600526 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.600538 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.600548 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.600570 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d79w\" (UniqueName: \"kubernetes.io/projected/49f24c54-4f24-4f97-a01a-04640bf67b0f-kube-api-access-5d79w\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.603095 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.701981 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbrfx\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-kube-api-access-zbrfx\") pod \"87b3198c-30ab-415a-b24b-b26ab3da838e\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.702022 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-combined-ca-bundle\") pod \"87b3198c-30ab-415a-b24b-b26ab3da838e\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.702059 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-scripts\") pod \"87b3198c-30ab-415a-b24b-b26ab3da838e\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.702119 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-certs\") pod \"87b3198c-30ab-415a-b24b-b26ab3da838e\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.702196 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-config-data\") pod \"87b3198c-30ab-415a-b24b-b26ab3da838e\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.707955 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-kube-api-access-zbrfx" (OuterVolumeSpecName: "kube-api-access-zbrfx") pod "87b3198c-30ab-415a-b24b-b26ab3da838e" (UID: "87b3198c-30ab-415a-b24b-b26ab3da838e"). InnerVolumeSpecName "kube-api-access-zbrfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.711537 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-certs" (OuterVolumeSpecName: "certs") pod "87b3198c-30ab-415a-b24b-b26ab3da838e" (UID: "87b3198c-30ab-415a-b24b-b26ab3da838e"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.719395 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-scripts" (OuterVolumeSpecName: "scripts") pod "87b3198c-30ab-415a-b24b-b26ab3da838e" (UID: "87b3198c-30ab-415a-b24b-b26ab3da838e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.745252 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87b3198c-30ab-415a-b24b-b26ab3da838e" (UID: "87b3198c-30ab-415a-b24b-b26ab3da838e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.746484 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-config-data" (OuterVolumeSpecName: "config-data") pod "87b3198c-30ab-415a-b24b-b26ab3da838e" (UID: "87b3198c-30ab-415a-b24b-b26ab3da838e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.804393 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbrfx\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-kube-api-access-zbrfx\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.804741 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.804821 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.804884 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.804937 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.117500 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-g672n" event={"ID":"87b3198c-30ab-415a-b24b-b26ab3da838e","Type":"ContainerDied","Data":"5ea5e68fe7fb3730a14c055ae47e47a12d2ed4ea16d87ceb507c87aa6e875602"} Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.118571 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ea5e68fe7fb3730a14c055ae47e47a12d2ed4ea16d87ceb507c87aa6e875602" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.118733 4781 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.127274 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerStarted","Data":"41f966adea97cdc475ad08a86a255ece7a9da3613c19d0f63f5a59a5a293320f"} Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.129825 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbbfd856b-vgvjg" event={"ID":"49f24c54-4f24-4f97-a01a-04640bf67b0f","Type":"ContainerDied","Data":"abf074d9baa2f3d6e8969094139a58da187066e40f9840d7df7ac1542a6fb7f6"} Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.130021 4781 scope.go:117] "RemoveContainer" containerID="0f274fbd09031e3b8e38174b2cbe52a6c6f5f24b60283aea8a0e1a01875fd8b1" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.129924 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.162871 4781 scope.go:117] "RemoveContainer" containerID="ee82fae8a491ded998bd0190a2bb94c2ff762013e3316811c1d1f983f5c06787" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.185476 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5fbbfd856b-vgvjg"] Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.199600 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5fbbfd856b-vgvjg"] Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.301297 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 27 00:27:58 crc kubenswrapper[4781]: E0227 00:27:58.301683 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.301695 4781 
state_mem.go:107] "Deleted CPUSet assignment" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api" Feb 27 00:27:58 crc kubenswrapper[4781]: E0227 00:27:58.301712 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api-log" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.301718 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api-log" Feb 27 00:27:58 crc kubenswrapper[4781]: E0227 00:27:58.301728 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87b3198c-30ab-415a-b24b-b26ab3da838e" containerName="cloudkitty-storageinit" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.301735 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b3198c-30ab-415a-b24b-b26ab3da838e" containerName="cloudkitty-storageinit" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.302762 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api-log" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.302780 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.302791 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="87b3198c-30ab-415a-b24b-b26ab3da838e" containerName="cloudkitty-storageinit" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.303425 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.309978 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.310167 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.310272 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.310471 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.310576 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-qt68h" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.318589 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.376209 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-5mf9t"] Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.378616 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.400972 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-5mf9t"] Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.416836 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.417098 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-svc\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.417220 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.417326 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4nk7\" (UniqueName: \"kubernetes.io/projected/39b2afc0-76d7-48e9-8528-f88e3ba22955-kube-api-access-w4nk7\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.417402 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-config\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.417487 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.520788 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.521536 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.521643 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.521862 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522037 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4nk7\" (UniqueName: \"kubernetes.io/projected/39b2afc0-76d7-48e9-8528-f88e3ba22955-kube-api-access-w4nk7\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522142 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-config\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522232 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522310 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-scripts\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522528 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522689 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522918 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-svc\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522975 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-config\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522987 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-certs\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.523053 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mj2t\" (UniqueName: 
\"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-kube-api-access-5mj2t\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.523398 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.523436 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.523586 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-svc\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.545457 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4nk7\" (UniqueName: \"kubernetes.io/projected/39b2afc0-76d7-48e9-8528-f88e3ba22955-kube-api-access-w4nk7\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.567027 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.568732 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.573480 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.598607 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.625270 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-scripts\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.625361 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.625498 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-certs\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.625528 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mj2t\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-kube-api-access-5mj2t\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.625649 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.625713 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.641168 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-scripts\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.641302 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.646305 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.646835 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-certs\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc 
kubenswrapper[4781]: I0227 00:27:58.653999 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mj2t\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-kube-api-access-5mj2t\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.656535 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.730355 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-scripts\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.730402 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.730466 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a21c78-44f9-4e7a-81cc-8488b0fd942a-logs\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.730487 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" 
(UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-certs\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.730548 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.730575 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvdwn\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-kube-api-access-cvdwn\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.730600 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.741596 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.832034 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvdwn\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-kube-api-access-cvdwn\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.832095 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.832166 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-scripts\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.832193 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.832255 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a21c78-44f9-4e7a-81cc-8488b0fd942a-logs\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.832280 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-certs\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.832352 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.833798 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a21c78-44f9-4e7a-81cc-8488b0fd942a-logs\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.837235 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.841426 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.853591 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-certs\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 
27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.857239 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-scripts\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.861057 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvdwn\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-kube-api-access-cvdwn\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.864597 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.907595 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.949972 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 27 00:27:59 crc kubenswrapper[4781]: I0227 00:27:59.322901 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" path="/var/lib/kubelet/pods/49f24c54-4f24-4f97-a01a-04640bf67b0f/volumes" Feb 27 00:27:59 crc kubenswrapper[4781]: I0227 00:27:59.337508 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-5mf9t"] Feb 27 00:27:59 crc kubenswrapper[4781]: I0227 00:27:59.510512 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:27:59 crc kubenswrapper[4781]: I0227 00:27:59.648548 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 27 00:27:59 crc kubenswrapper[4781]: W0227 00:27:59.694037 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode66fa513_66e6_4821_ad96_4bfe56e359f1.slice/crio-5565ddef70981fe2780f1609d8fa35f56abe3f10d059edc749ee7568f1b9b3fe WatchSource:0}: Error finding container 5565ddef70981fe2780f1609d8fa35f56abe3f10d059edc749ee7568f1b9b3fe: Status 404 returned error can't find the container with id 5565ddef70981fe2780f1609d8fa35f56abe3f10d059edc749ee7568f1b9b3fe Feb 27 00:27:59 crc kubenswrapper[4781]: I0227 00:27:59.991039 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.122129 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.138177 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535868-f5csp"] Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.139901 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535868-f5csp" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.146215 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.146446 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.146695 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.155245 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535868-f5csp"] Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.184880 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"02a21c78-44f9-4e7a-81cc-8488b0fd942a","Type":"ContainerStarted","Data":"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f"} Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.184927 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"02a21c78-44f9-4e7a-81cc-8488b0fd942a","Type":"ContainerStarted","Data":"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9"} Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.184937 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"02a21c78-44f9-4e7a-81cc-8488b0fd942a","Type":"ContainerStarted","Data":"210ddd6c96ce311a55b219a45ca27f47a76e0b915886957693e3838eb107b875"} Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.185846 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.192731 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="39b2afc0-76d7-48e9-8528-f88e3ba22955" containerID="ba0fa606453c74eda00c418113d9f320bbbe55741c968eedcc82d3ff7571054d" exitCode=0
Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.192794 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" event={"ID":"39b2afc0-76d7-48e9-8528-f88e3ba22955","Type":"ContainerDied","Data":"ba0fa606453c74eda00c418113d9f320bbbe55741c968eedcc82d3ff7571054d"}
Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.192822 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" event={"ID":"39b2afc0-76d7-48e9-8528-f88e3ba22955","Type":"ContainerStarted","Data":"ede845938dcbb2c0e3303591186eb47bf17d10a92d1b0dd61b8430ff2dd6aa13"}
Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.196856 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"e66fa513-66e6-4821-ad96-4bfe56e359f1","Type":"ContainerStarted","Data":"5565ddef70981fe2780f1609d8fa35f56abe3f10d059edc749ee7568f1b9b3fe"}
Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.227235 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.227213951 podStartE2EDuration="2.227213951s" podCreationTimestamp="2026-02-27 00:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:28:00.218013008 +0000 UTC m=+1349.475552562" watchObservedRunningTime="2026-02-27 00:28:00.227213951 +0000 UTC m=+1349.484753505"
Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.288442 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb9mz\" (UniqueName: \"kubernetes.io/projected/f3df72f1-7ac9-4877-a7b4-a17b5c724303-kube-api-access-fb9mz\") pod \"auto-csr-approver-29535868-f5csp\" (UID: \"f3df72f1-7ac9-4877-a7b4-a17b5c724303\") " pod="openshift-infra/auto-csr-approver-29535868-f5csp"
Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.390525 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb9mz\" (UniqueName: \"kubernetes.io/projected/f3df72f1-7ac9-4877-a7b4-a17b5c724303-kube-api-access-fb9mz\") pod \"auto-csr-approver-29535868-f5csp\" (UID: \"f3df72f1-7ac9-4877-a7b4-a17b5c724303\") " pod="openshift-infra/auto-csr-approver-29535868-f5csp"
Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.412330 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb9mz\" (UniqueName: \"kubernetes.io/projected/f3df72f1-7ac9-4877-a7b4-a17b5c724303-kube-api-access-fb9mz\") pod \"auto-csr-approver-29535868-f5csp\" (UID: \"f3df72f1-7ac9-4877-a7b4-a17b5c724303\") " pod="openshift-infra/auto-csr-approver-29535868-f5csp"
Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.459483 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535868-f5csp"
Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.896075 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535868-f5csp"]
Feb 27 00:28:00 crc kubenswrapper[4781]: W0227 00:28:00.942734 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3df72f1_7ac9_4877_a7b4_a17b5c724303.slice/crio-e55a7008a665e255411e030af55e4960e0ce58f4f35f88a35bcbcae2103d9e43 WatchSource:0}: Error finding container e55a7008a665e255411e030af55e4960e0ce58f4f35f88a35bcbcae2103d9e43: Status 404 returned error can't find the container with id e55a7008a665e255411e030af55e4960e0ce58f4f35f88a35bcbcae2103d9e43
Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.213473 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535868-f5csp" event={"ID":"f3df72f1-7ac9-4877-a7b4-a17b5c724303","Type":"ContainerStarted","Data":"e55a7008a665e255411e030af55e4960e0ce58f4f35f88a35bcbcae2103d9e43"}
Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.220307 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" event={"ID":"39b2afc0-76d7-48e9-8528-f88e3ba22955","Type":"ContainerStarted","Data":"bb8c0d69bd70d80999cf07d7e8306d44a8648ef91de2762edd1a659e5f8fb1d6"}
Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.221575 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t"
Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.227524 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerStarted","Data":"e1347c9105935db12132917f879bb29404eb3328ae4b62fde3c6673f55672741"}
Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.227563 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.239772 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" podStartSLOduration=3.239753891 podStartE2EDuration="3.239753891s" podCreationTimestamp="2026-02-27 00:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:28:01.237581253 +0000 UTC m=+1350.495120827" watchObservedRunningTime="2026-02-27 00:28:01.239753891 +0000 UTC m=+1350.497293445"
Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.267782 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.636315804 podStartE2EDuration="7.267763461s" podCreationTimestamp="2026-02-27 00:27:54 +0000 UTC" firstStartedPulling="2026-02-27 00:27:55.352065066 +0000 UTC m=+1344.609604620" lastFinishedPulling="2026-02-27 00:27:59.983512723 +0000 UTC m=+1349.241052277" observedRunningTime="2026-02-27 00:28:01.265647155 +0000 UTC m=+1350.523186709" watchObservedRunningTime="2026-02-27 00:28:01.267763461 +0000 UTC m=+1350.525303015"
Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.628390 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.939416 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 27 00:28:03 crc kubenswrapper[4781]: I0227 00:28:03.248317 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"e66fa513-66e6-4821-ad96-4bfe56e359f1","Type":"ContainerStarted","Data":"fdc3e6d3767980267676c3bb178abb962b9ba33efb09510bb187625fd32978dd"}
Feb 27 00:28:03 crc kubenswrapper[4781]: I0227 00:28:03.250096 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535868-f5csp" event={"ID":"f3df72f1-7ac9-4877-a7b4-a17b5c724303","Type":"ContainerStarted","Data":"58983f3a0d32568b0a106e31b532196dd7e3e78ec29a99f5dc4c44649ec4e605"}
Feb 27 00:28:03 crc kubenswrapper[4781]: I0227 00:28:03.250599 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerName="cloudkitty-api-log" containerID="cri-o://452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9" gracePeriod=30
Feb 27 00:28:03 crc kubenswrapper[4781]: I0227 00:28:03.250668 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerName="cloudkitty-api" containerID="cri-o://2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f" gracePeriod=30
Feb 27 00:28:03 crc kubenswrapper[4781]: I0227 00:28:03.273196 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.205754485 podStartE2EDuration="5.273162751s" podCreationTimestamp="2026-02-27 00:27:58 +0000 UTC" firstStartedPulling="2026-02-27 00:27:59.69685193 +0000 UTC m=+1348.954391494" lastFinishedPulling="2026-02-27 00:28:02.764260206 +0000 UTC m=+1352.021799760" observedRunningTime="2026-02-27 00:28:03.26670315 +0000 UTC m=+1352.524242704" watchObservedRunningTime="2026-02-27 00:28:03.273162751 +0000 UTC m=+1352.530702305"
Feb 27 00:28:03 crc kubenswrapper[4781]: I0227 00:28:03.307326 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:28:03 crc kubenswrapper[4781]: I0227 00:28:03.321494 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535868-f5csp" podStartSLOduration=1.518159815 podStartE2EDuration="3.321474987s" podCreationTimestamp="2026-02-27 00:28:00 +0000 UTC" firstStartedPulling="2026-02-27 00:28:00.961639423 +0000 UTC m=+1350.219178967" lastFinishedPulling="2026-02-27 00:28:02.764954585 +0000 UTC m=+1352.022494139" observedRunningTime="2026-02-27 00:28:03.301725916 +0000 UTC m=+1352.559265470" watchObservedRunningTime="2026-02-27 00:28:03.321474987 +0000 UTC m=+1352.579014541"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.120853 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.267144 4781 generic.go:334] "Generic (PLEG): container finished" podID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerID="2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f" exitCode=0
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.267181 4781 generic.go:334] "Generic (PLEG): container finished" podID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerID="452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9" exitCode=143
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.267318 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.267855 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"02a21c78-44f9-4e7a-81cc-8488b0fd942a","Type":"ContainerDied","Data":"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f"}
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.267905 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"02a21c78-44f9-4e7a-81cc-8488b0fd942a","Type":"ContainerDied","Data":"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9"}
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.267919 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"02a21c78-44f9-4e7a-81cc-8488b0fd942a","Type":"ContainerDied","Data":"210ddd6c96ce311a55b219a45ca27f47a76e0b915886957693e3838eb107b875"}
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.267976 4781 scope.go:117] "RemoveContainer" containerID="2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.270507 4781 generic.go:334] "Generic (PLEG): container finished" podID="f3df72f1-7ac9-4877-a7b4-a17b5c724303" containerID="58983f3a0d32568b0a106e31b532196dd7e3e78ec29a99f5dc4c44649ec4e605" exitCode=0
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.270569 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535868-f5csp" event={"ID":"f3df72f1-7ac9-4877-a7b4-a17b5c724303","Type":"ContainerDied","Data":"58983f3a0d32568b0a106e31b532196dd7e3e78ec29a99f5dc4c44649ec4e605"}
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.277857 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a21c78-44f9-4e7a-81cc-8488b0fd942a-logs\") pod \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") "
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.277935 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-scripts\") pod \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") "
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.278075 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-combined-ca-bundle\") pod \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") "
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.278115 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data-custom\") pod \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") "
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.278142 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data\") pod \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") "
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.278230 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-certs\") pod \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") "
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.278263 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvdwn\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-kube-api-access-cvdwn\") pod \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") "
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.278334 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02a21c78-44f9-4e7a-81cc-8488b0fd942a-logs" (OuterVolumeSpecName: "logs") pod "02a21c78-44f9-4e7a-81cc-8488b0fd942a" (UID: "02a21c78-44f9-4e7a-81cc-8488b0fd942a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.278842 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a21c78-44f9-4e7a-81cc-8488b0fd942a-logs\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.288778 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-scripts" (OuterVolumeSpecName: "scripts") pod "02a21c78-44f9-4e7a-81cc-8488b0fd942a" (UID: "02a21c78-44f9-4e7a-81cc-8488b0fd942a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.313589 4781 scope.go:117] "RemoveContainer" containerID="452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.321114 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "02a21c78-44f9-4e7a-81cc-8488b0fd942a" (UID: "02a21c78-44f9-4e7a-81cc-8488b0fd942a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.321527 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-certs" (OuterVolumeSpecName: "certs") pod "02a21c78-44f9-4e7a-81cc-8488b0fd942a" (UID: "02a21c78-44f9-4e7a-81cc-8488b0fd942a"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.325069 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-kube-api-access-cvdwn" (OuterVolumeSpecName: "kube-api-access-cvdwn") pod "02a21c78-44f9-4e7a-81cc-8488b0fd942a" (UID: "02a21c78-44f9-4e7a-81cc-8488b0fd942a"). InnerVolumeSpecName "kube-api-access-cvdwn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.338306 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02a21c78-44f9-4e7a-81cc-8488b0fd942a" (UID: "02a21c78-44f9-4e7a-81cc-8488b0fd942a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.381128 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.381157 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.381167 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-certs\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.381176 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvdwn\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-kube-api-access-cvdwn\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.381186 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.428822 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data" (OuterVolumeSpecName: "config-data") pod "02a21c78-44f9-4e7a-81cc-8488b0fd942a" (UID: "02a21c78-44f9-4e7a-81cc-8488b0fd942a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.478405 4781 scope.go:117] "RemoveContainer" containerID="2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f"
Feb 27 00:28:04 crc kubenswrapper[4781]: E0227 00:28:04.478918 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f\": container with ID starting with 2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f not found: ID does not exist" containerID="2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.478992 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f"} err="failed to get container status \"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f\": rpc error: code = NotFound desc = could not find container \"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f\": container with ID starting with 2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f not found: ID does not exist"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.479024 4781 scope.go:117] "RemoveContainer" containerID="452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9"
Feb 27 00:28:04 crc kubenswrapper[4781]: E0227 00:28:04.479333 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9\": container with ID starting with 452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9 not found: ID does not exist" containerID="452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.479381 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9"} err="failed to get container status \"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9\": rpc error: code = NotFound desc = could not find container \"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9\": container with ID starting with 452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9 not found: ID does not exist"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.479410 4781 scope.go:117] "RemoveContainer" containerID="2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.479689 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f"} err="failed to get container status \"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f\": rpc error: code = NotFound desc = could not find container \"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f\": container with ID starting with 2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f not found: ID does not exist"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.479708 4781 scope.go:117] "RemoveContainer" containerID="452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.479922 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9"} err="failed to get container status \"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9\": rpc error: code = NotFound desc = could not find container \"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9\": container with ID starting with 452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9 not found: ID does not exist"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.483184 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.503517 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-76c479bbf8-lkpd7"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.604050 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.612427 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.624606 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 27 00:28:04 crc kubenswrapper[4781]: E0227 00:28:04.625206 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerName="cloudkitty-api"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.625289 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerName="cloudkitty-api"
Feb 27 00:28:04 crc kubenswrapper[4781]: E0227 00:28:04.625373 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerName="cloudkitty-api-log"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.625434 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerName="cloudkitty-api-log"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.625682 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerName="cloudkitty-api-log"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.625766 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerName="cloudkitty-api"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.626815 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.628684 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.628848 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.638074 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.644268 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795143 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795210 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795306 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptk2k\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-kube-api-access-ptk2k\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795326 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75721c64-91e7-468b-8157-9f7b0f8060b0-logs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795368 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-scripts\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795386 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795408 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795455 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795488 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.896934 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.897046 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.897082 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.897124 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptk2k\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-kube-api-access-ptk2k\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.897143 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75721c64-91e7-468b-8157-9f7b0f8060b0-logs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.897184 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-scripts\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.897200 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.897228 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.897278 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.898777 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75721c64-91e7-468b-8157-9f7b0f8060b0-logs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.902297 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-scripts\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.903088 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.907078 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.914230 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.921305 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.922700 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.940247 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.952291 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptk2k\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-kube-api-access-ptk2k\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0"
Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.978029 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 27 00:28:05 crc kubenswrapper[4781]: I0227 00:28:05.297903 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="e66fa513-66e6-4821-ad96-4bfe56e359f1" containerName="cloudkitty-proc" containerID="cri-o://fdc3e6d3767980267676c3bb178abb962b9ba33efb09510bb187625fd32978dd" gracePeriod=30
Feb 27 00:28:05 crc kubenswrapper[4781]: I0227 00:28:05.338013 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" path="/var/lib/kubelet/pods/02a21c78-44f9-4e7a-81cc-8488b0fd942a/volumes"
Feb 27 00:28:05 crc kubenswrapper[4781]: I0227 00:28:05.710294 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-76c479bbf8-lkpd7"
Feb 27 00:28:05 crc kubenswrapper[4781]: I0227 00:28:05.852668 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 27 00:28:05 crc kubenswrapper[4781]: I0227 00:28:05.962817 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6d64c6bb46-jcp5p"]
Feb 27 00:28:05 crc kubenswrapper[4781]: I0227 00:28:05.964511 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d64c6bb46-jcp5p"
Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.002022 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d64c6bb46-jcp5p"]
Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.046304 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-logs\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p"
Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.046375 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-scripts\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p"
Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.046528 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-config-data\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p"
Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.046557 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-combined-ca-bundle\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p"
Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.046582 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rrz7\"
(UniqueName: \"kubernetes.io/projected/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-kube-api-access-4rrz7\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.046805 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-internal-tls-certs\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.046838 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-public-tls-certs\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.157757 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-logs\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.157807 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-scripts\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.157857 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-config-data\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.157884 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-combined-ca-bundle\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.157908 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rrz7\" (UniqueName: \"kubernetes.io/projected/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-kube-api-access-4rrz7\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.157982 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-internal-tls-certs\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.157999 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-public-tls-certs\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.159325 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-logs\") 
pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.164971 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535868-f5csp" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.165996 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-scripts\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.167666 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-combined-ca-bundle\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.168711 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-config-data\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.168757 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-public-tls-certs\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.186666 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-internal-tls-certs\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.195272 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rrz7\" (UniqueName: \"kubernetes.io/projected/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-kube-api-access-4rrz7\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.321499 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"75721c64-91e7-468b-8157-9f7b0f8060b0","Type":"ContainerStarted","Data":"d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241"} Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.321538 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"75721c64-91e7-468b-8157-9f7b0f8060b0","Type":"ContainerStarted","Data":"27f0f2f53c09daadc606bc872e1f5df520a0c8f2a01549f894ec755d7a09a157"} Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.327869 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535868-f5csp" event={"ID":"f3df72f1-7ac9-4877-a7b4-a17b5c724303","Type":"ContainerDied","Data":"e55a7008a665e255411e030af55e4960e0ce58f4f35f88a35bcbcae2103d9e43"} Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.327900 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e55a7008a665e255411e030af55e4960e0ce58f4f35f88a35bcbcae2103d9e43" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.327947 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535868-f5csp" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.366928 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb9mz\" (UniqueName: \"kubernetes.io/projected/f3df72f1-7ac9-4877-a7b4-a17b5c724303-kube-api-access-fb9mz\") pod \"f3df72f1-7ac9-4877-a7b4-a17b5c724303\" (UID: \"f3df72f1-7ac9-4877-a7b4-a17b5c724303\") " Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.371028 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3df72f1-7ac9-4877-a7b4-a17b5c724303-kube-api-access-fb9mz" (OuterVolumeSpecName: "kube-api-access-fb9mz") pod "f3df72f1-7ac9-4877-a7b4-a17b5c724303" (UID: "f3df72f1-7ac9-4877-a7b4-a17b5c724303"). InnerVolumeSpecName "kube-api-access-fb9mz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.446235 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.469464 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb9mz\" (UniqueName: \"kubernetes.io/projected/f3df72f1-7ac9-4877-a7b4-a17b5c724303-kube-api-access-fb9mz\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.554102 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.959681 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d64c6bb46-jcp5p"] Feb 27 00:28:06 crc kubenswrapper[4781]: W0227 00:28:06.959750 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ff35aa7_7e5a_4069_8dc4_392e01a957e3.slice/crio-5959b2f6e7fff1a2a51f3463d90c725e2eb81981c7fd050dcdeb36b804b2be0b WatchSource:0}: Error finding container 5959b2f6e7fff1a2a51f3463d90c725e2eb81981c7fd050dcdeb36b804b2be0b: Status 404 returned error can't find the container with id 5959b2f6e7fff1a2a51f3463d90c725e2eb81981c7fd050dcdeb36b804b2be0b Feb 27 00:28:07 crc kubenswrapper[4781]: I0227 00:28:07.253894 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535862-l9vc5"] Feb 27 00:28:07 crc kubenswrapper[4781]: I0227 00:28:07.266420 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535862-l9vc5"] Feb 27 00:28:07 crc kubenswrapper[4781]: I0227 00:28:07.326854 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="411dc0f9-584c-453b-a137-189ab8731570" path="/var/lib/kubelet/pods/411dc0f9-584c-453b-a137-189ab8731570/volumes" Feb 27 00:28:07 crc kubenswrapper[4781]: I0227 00:28:07.343370 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d64c6bb46-jcp5p" 
event={"ID":"5ff35aa7-7e5a-4069-8dc4-392e01a957e3","Type":"ContainerStarted","Data":"8844002fa649bbfd29001c87d27f71d4621e3669a75ccf85fd1badf27a87e1a6"} Feb 27 00:28:07 crc kubenswrapper[4781]: I0227 00:28:07.343410 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d64c6bb46-jcp5p" event={"ID":"5ff35aa7-7e5a-4069-8dc4-392e01a957e3","Type":"ContainerStarted","Data":"5959b2f6e7fff1a2a51f3463d90c725e2eb81981c7fd050dcdeb36b804b2be0b"} Feb 27 00:28:07 crc kubenswrapper[4781]: I0227 00:28:07.347484 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"75721c64-91e7-468b-8157-9f7b0f8060b0","Type":"ContainerStarted","Data":"e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a"} Feb 27 00:28:07 crc kubenswrapper[4781]: I0227 00:28:07.347760 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 27 00:28:07 crc kubenswrapper[4781]: I0227 00:28:07.385541 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.385523224 podStartE2EDuration="3.385523224s" podCreationTimestamp="2026-02-27 00:28:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:28:07.375307534 +0000 UTC m=+1356.632847088" watchObservedRunningTime="2026-02-27 00:28:07.385523224 +0000 UTC m=+1356.643062778" Feb 27 00:28:08 crc kubenswrapper[4781]: I0227 00:28:08.365775 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d64c6bb46-jcp5p" event={"ID":"5ff35aa7-7e5a-4069-8dc4-392e01a957e3","Type":"ContainerStarted","Data":"e34be0223d33d9c3740d3959a945adda9dff38a4a277f5d5d8122ac8617d942e"} Feb 27 00:28:08 crc kubenswrapper[4781]: I0227 00:28:08.366136 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:08 crc 
kubenswrapper[4781]: I0227 00:28:08.366465 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:08 crc kubenswrapper[4781]: I0227 00:28:08.399502 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6d64c6bb46-jcp5p" podStartSLOduration=3.399485211 podStartE2EDuration="3.399485211s" podCreationTimestamp="2026-02-27 00:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:28:08.388112471 +0000 UTC m=+1357.645652025" watchObservedRunningTime="2026-02-27 00:28:08.399485211 +0000 UTC m=+1357.657024765" Feb 27 00:28:08 crc kubenswrapper[4781]: I0227 00:28:08.742797 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:28:08 crc kubenswrapper[4781]: I0227 00:28:08.798012 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-btbp6"] Feb 27 00:28:08 crc kubenswrapper[4781]: I0227 00:28:08.798512 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" podUID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" containerName="dnsmasq-dns" containerID="cri-o://8eb943556508c5cc9103fa044300406224b9b4973d8e501d8f7538f1c3573e24" gracePeriod=10 Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.242156 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 27 00:28:09 crc kubenswrapper[4781]: E0227 00:28:09.242710 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3df72f1-7ac9-4877-a7b4-a17b5c724303" containerName="oc" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.242734 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3df72f1-7ac9-4877-a7b4-a17b5c724303" containerName="oc" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 
00:28:09.242971 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3df72f1-7ac9-4877-a7b4-a17b5c724303" containerName="oc" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.243860 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.245585 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.251999 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.252196 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.252599 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-x24nv" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.348734 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnh77\" (UniqueName: \"kubernetes.io/projected/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-kube-api-access-pnh77\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.349175 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.349256 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.350819 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.386386 4781 generic.go:334] "Generic (PLEG): container finished" podID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" containerID="8eb943556508c5cc9103fa044300406224b9b4973d8e501d8f7538f1c3573e24" exitCode=0 Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.386446 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" event={"ID":"f2a90e98-bb9f-436d-9a1c-8aebd91000e3","Type":"ContainerDied","Data":"8eb943556508c5cc9103fa044300406224b9b4973d8e501d8f7538f1c3573e24"} Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.386473 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" event={"ID":"f2a90e98-bb9f-436d-9a1c-8aebd91000e3","Type":"ContainerDied","Data":"f5601a008ad9454c1a7af70c2d0c5712b2a38f8540f6108d4eb74d5c92b8bcd7"} Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.386484 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5601a008ad9454c1a7af70c2d0c5712b2a38f8540f6108d4eb74d5c92b8bcd7" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.388168 4781 generic.go:334] "Generic (PLEG): container finished" podID="e66fa513-66e6-4821-ad96-4bfe56e359f1" containerID="fdc3e6d3767980267676c3bb178abb962b9ba33efb09510bb187625fd32978dd" exitCode=0 Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.389246 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"e66fa513-66e6-4821-ad96-4bfe56e359f1","Type":"ContainerDied","Data":"fdc3e6d3767980267676c3bb178abb962b9ba33efb09510bb187625fd32978dd"} Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.454784 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnh77\" (UniqueName: \"kubernetes.io/projected/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-kube-api-access-pnh77\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.454920 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.454944 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.454985 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.459177 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config\") pod \"openstackclient\" 
(UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.463158 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.463877 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.472178 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnh77\" (UniqueName: \"kubernetes.io/projected/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-kube-api-access-pnh77\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.519833 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.520744 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.542755 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.544409 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.551951 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557158 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-sb\") pod \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557267 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-nb\") pod \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557306 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-scripts\") pod \"e66fa513-66e6-4821-ad96-4bfe56e359f1\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557342 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data\") pod \"e66fa513-66e6-4821-ad96-4bfe56e359f1\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557361 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-combined-ca-bundle\") pod \"e66fa513-66e6-4821-ad96-4bfe56e359f1\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " Feb 27 00:28:09 crc kubenswrapper[4781]: 
I0227 00:28:09.557400 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-certs\") pod \"e66fa513-66e6-4821-ad96-4bfe56e359f1\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557429 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-swift-storage-0\") pod \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557502 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-config\") pod \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557536 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w96d\" (UniqueName: \"kubernetes.io/projected/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-kube-api-access-4w96d\") pod \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557558 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-svc\") pod \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557590 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data-custom\") pod 
\"e66fa513-66e6-4821-ad96-4bfe56e359f1\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557608 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mj2t\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-kube-api-access-5mj2t\") pod \"e66fa513-66e6-4821-ad96-4bfe56e359f1\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.558430 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 27 00:28:09 crc kubenswrapper[4781]: E0227 00:28:09.558822 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e66fa513-66e6-4821-ad96-4bfe56e359f1" containerName="cloudkitty-proc" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.558845 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66fa513-66e6-4821-ad96-4bfe56e359f1" containerName="cloudkitty-proc" Feb 27 00:28:09 crc kubenswrapper[4781]: E0227 00:28:09.558877 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" containerName="init" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.558884 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" containerName="init" Feb 27 00:28:09 crc kubenswrapper[4781]: E0227 00:28:09.558899 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" containerName="dnsmasq-dns" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.558906 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" containerName="dnsmasq-dns" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.559091 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" containerName="dnsmasq-dns" Feb 27 00:28:09 crc 
kubenswrapper[4781]: I0227 00:28:09.559118 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e66fa513-66e6-4821-ad96-4bfe56e359f1" containerName="cloudkitty-proc" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.560404 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.565945 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-kube-api-access-5mj2t" (OuterVolumeSpecName: "kube-api-access-5mj2t") pod "e66fa513-66e6-4821-ad96-4bfe56e359f1" (UID: "e66fa513-66e6-4821-ad96-4bfe56e359f1"). InnerVolumeSpecName "kube-api-access-5mj2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.566799 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-scripts" (OuterVolumeSpecName: "scripts") pod "e66fa513-66e6-4821-ad96-4bfe56e359f1" (UID: "e66fa513-66e6-4821-ad96-4bfe56e359f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.577037 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-kube-api-access-4w96d" (OuterVolumeSpecName: "kube-api-access-4w96d") pod "f2a90e98-bb9f-436d-9a1c-8aebd91000e3" (UID: "f2a90e98-bb9f-436d-9a1c-8aebd91000e3"). InnerVolumeSpecName "kube-api-access-4w96d". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.583494 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e66fa513-66e6-4821-ad96-4bfe56e359f1" (UID: "e66fa513-66e6-4821-ad96-4bfe56e359f1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.594826 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-certs" (OuterVolumeSpecName: "certs") pod "e66fa513-66e6-4821-ad96-4bfe56e359f1" (UID: "e66fa513-66e6-4821-ad96-4bfe56e359f1"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.611703 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.611972 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e66fa513-66e6-4821-ad96-4bfe56e359f1" (UID: "e66fa513-66e6-4821-ad96-4bfe56e359f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661118 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/02c4875e-e180-4365-a00a-828ab5d95c34-openstack-config-secret\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient"
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661190 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5rtv\" (UniqueName: \"kubernetes.io/projected/02c4875e-e180-4365-a00a-828ab5d95c34-kube-api-access-l5rtv\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient"
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661538 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/02c4875e-e180-4365-a00a-828ab5d95c34-openstack-config\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient"
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661809 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c4875e-e180-4365-a00a-828ab5d95c34-combined-ca-bundle\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient"
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661897 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661913 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661928 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-certs\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661938 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w96d\" (UniqueName: \"kubernetes.io/projected/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-kube-api-access-4w96d\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661947 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661956 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mj2t\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-kube-api-access-5mj2t\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.670794 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data" (OuterVolumeSpecName: "config-data") pod "e66fa513-66e6-4821-ad96-4bfe56e359f1" (UID: "e66fa513-66e6-4821-ad96-4bfe56e359f1"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.671478 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f2a90e98-bb9f-436d-9a1c-8aebd91000e3" (UID: "f2a90e98-bb9f-436d-9a1c-8aebd91000e3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.692889 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-config" (OuterVolumeSpecName: "config") pod "f2a90e98-bb9f-436d-9a1c-8aebd91000e3" (UID: "f2a90e98-bb9f-436d-9a1c-8aebd91000e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:28:09 crc kubenswrapper[4781]: E0227 00:28:09.725068 4781 log.go:32] "RunPodSandbox from runtime service failed" err=<
Feb 27 00:28:09 crc kubenswrapper[4781]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_23ff7bad-67ec-4ef6-b3b9-c997a99a62b4_0(6bca2a3521cbb1795eee380ff04718caf1ceb0e3915c9a371a67015f320b28d5): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6bca2a3521cbb1795eee380ff04718caf1ceb0e3915c9a371a67015f320b28d5" Netns:"/var/run/netns/c947b9e9-9f00-44f6-85ce-84835c04cc12" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=6bca2a3521cbb1795eee380ff04718caf1ceb0e3915c9a371a67015f320b28d5;K8S_POD_UID=23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4]: expected pod UID "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" but got "02c4875e-e180-4365-a00a-828ab5d95c34" from Kube API
Feb 27 00:28:09 crc kubenswrapper[4781]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Feb 27 00:28:09 crc kubenswrapper[4781]: >
Feb 27 00:28:09 crc kubenswrapper[4781]: E0227 00:28:09.725599 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Feb 27 00:28:09 crc kubenswrapper[4781]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_23ff7bad-67ec-4ef6-b3b9-c997a99a62b4_0(6bca2a3521cbb1795eee380ff04718caf1ceb0e3915c9a371a67015f320b28d5): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6bca2a3521cbb1795eee380ff04718caf1ceb0e3915c9a371a67015f320b28d5" Netns:"/var/run/netns/c947b9e9-9f00-44f6-85ce-84835c04cc12" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=6bca2a3521cbb1795eee380ff04718caf1ceb0e3915c9a371a67015f320b28d5;K8S_POD_UID=23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4]: expected pod UID "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" but got "02c4875e-e180-4365-a00a-828ab5d95c34" from Kube API
Feb 27 00:28:09 crc kubenswrapper[4781]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Feb 27 00:28:09 crc kubenswrapper[4781]: > pod="openstack/openstackclient"
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.725598 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f2a90e98-bb9f-436d-9a1c-8aebd91000e3" (UID: "f2a90e98-bb9f-436d-9a1c-8aebd91000e3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.741867 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f2a90e98-bb9f-436d-9a1c-8aebd91000e3" (UID: "f2a90e98-bb9f-436d-9a1c-8aebd91000e3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.744380 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f2a90e98-bb9f-436d-9a1c-8aebd91000e3" (UID: "f2a90e98-bb9f-436d-9a1c-8aebd91000e3"). InnerVolumeSpecName "dns-swift-storage-0".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.762978 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c4875e-e180-4365-a00a-828ab5d95c34-combined-ca-bundle\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient"
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.763194 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/02c4875e-e180-4365-a00a-828ab5d95c34-openstack-config-secret\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient"
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.763327 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5rtv\" (UniqueName: \"kubernetes.io/projected/02c4875e-e180-4365-a00a-828ab5d95c34-kube-api-access-l5rtv\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient"
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.763743 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/02c4875e-e180-4365-a00a-828ab5d95c34-openstack-config\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient"
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.763898 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.763974 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.764040 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.764098 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.764721 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.764793 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.765614 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/02c4875e-e180-4365-a00a-828ab5d95c34-openstack-config\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient"
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.766573 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c4875e-e180-4365-a00a-828ab5d95c34-combined-ca-bundle\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient"
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.766996 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/02c4875e-e180-4365-a00a-828ab5d95c34-openstack-config-secret\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient"
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.778162 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5rtv\" (UniqueName: \"kubernetes.io/projected/02c4875e-e180-4365-a00a-828ab5d95c34-kube-api-access-l5rtv\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient"
Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.905981 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.409730 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.411531 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.411534 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"e66fa513-66e6-4821-ad96-4bfe56e359f1","Type":"ContainerDied","Data":"5565ddef70981fe2780f1609d8fa35f56abe3f10d059edc749ee7568f1b9b3fe"}
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.411581 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.411606 4781 scope.go:117] "RemoveContainer" containerID="fdc3e6d3767980267676c3bb178abb962b9ba33efb09510bb187625fd32978dd"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.411581 4781 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.424558 4781 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" podUID="02c4875e-e180-4365-a00a-828ab5d95c34"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.466925 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.478232 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-combined-ca-bundle\") pod \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") "
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.478303 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config\") pod \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") "
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.478454 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config-secret\") pod \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") "
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.478587 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnh77\" (UniqueName: \"kubernetes.io/projected/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-kube-api-access-pnh77\") pod \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") "
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.478972 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" (UID: "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.479460 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.490206 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" (UID: "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.490431 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.490394 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-kube-api-access-pnh77" (OuterVolumeSpecName: "kube-api-access-pnh77") pod "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" (UID: "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4"). InnerVolumeSpecName "kube-api-access-pnh77". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.493764 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" (UID: "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.524499 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.537695 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-btbp6"]
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.561824 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.566734 4781 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.570000 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.573208 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-btbp6"]
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.582255 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.585166 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.586231 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwnr7\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-kube-api-access-qwnr7\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.586357 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.586551 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.587738 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-scripts\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.587875 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-certs\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.588170 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnh77\" (UniqueName: \"kubernetes.io/projected/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-kube-api-access-pnh77\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.588231 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.588286 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.690417 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.690994 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwnr7\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-kube-api-access-qwnr7\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.691093 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.691207 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.691318 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-scripts\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.691440 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-certs\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.695757 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.696281 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-scripts\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.696439 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-certs\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.696541 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.697092 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.707691 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwnr7\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-kube-api-access-qwnr7\") pod \"cloudkitty-proc-0\" (UID:
\"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.941059 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:11 crc kubenswrapper[4781]: I0227 00:28:11.340728 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" path="/var/lib/kubelet/pods/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4/volumes"
Feb 27 00:28:11 crc kubenswrapper[4781]: I0227 00:28:11.342529 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e66fa513-66e6-4821-ad96-4bfe56e359f1" path="/var/lib/kubelet/pods/e66fa513-66e6-4821-ad96-4bfe56e359f1/volumes"
Feb 27 00:28:11 crc kubenswrapper[4781]: I0227 00:28:11.343997 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" path="/var/lib/kubelet/pods/f2a90e98-bb9f-436d-9a1c-8aebd91000e3/volumes"
Feb 27 00:28:11 crc kubenswrapper[4781]: I0227 00:28:11.423649 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"02c4875e-e180-4365-a00a-828ab5d95c34","Type":"ContainerStarted","Data":"27bb25f3bd1c0aa5e7843534c38a1b39afba807fb84921a9189815b48bf5d197"}
Feb 27 00:28:11 crc kubenswrapper[4781]: I0227 00:28:11.431113 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 27 00:28:11 crc kubenswrapper[4781]: I0227 00:28:11.436539 4781 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" podUID="02c4875e-e180-4365-a00a-828ab5d95c34"
Feb 27 00:28:11 crc kubenswrapper[4781]: I0227 00:28:11.491292 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:28:11 crc kubenswrapper[4781]: W0227 00:28:11.514794 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb34a476_dd22_4085_bb2c_a8e57b0d9889.slice/crio-3977a787677f2d7efa8ed48ba20f2b14940f8edb53792eab26f57aa6da59e0d1 WatchSource:0}: Error finding container 3977a787677f2d7efa8ed48ba20f2b14940f8edb53792eab26f57aa6da59e0d1: Status 404 returned error can't find the container with id 3977a787677f2d7efa8ed48ba20f2b14940f8edb53792eab26f57aa6da59e0d1
Feb 27 00:28:12 crc kubenswrapper[4781]: I0227 00:28:12.446056 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"db34a476-dd22-4085-bb2c-a8e57b0d9889","Type":"ContainerStarted","Data":"b68ba0c8681372d6b46f2eda69405ebdb37954378e7c21495676222df54a1d3a"}
Feb 27 00:28:12 crc kubenswrapper[4781]: I0227 00:28:12.446639 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"db34a476-dd22-4085-bb2c-a8e57b0d9889","Type":"ContainerStarted","Data":"3977a787677f2d7efa8ed48ba20f2b14940f8edb53792eab26f57aa6da59e0d1"}
Feb 27 00:28:12 crc kubenswrapper[4781]: I0227 00:28:12.467952 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.467936862 podStartE2EDuration="2.467936862s" podCreationTimestamp="2026-02-27 00:28:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:28:12.461615711 +0000 UTC m=+1361.719155265" watchObservedRunningTime="2026-02-27 00:28:12.467936862 +0000 UTC m=+1361.725476416"
Feb 27 00:28:13 crc kubenswrapper[4781]: I0227 00:28:13.893155 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:28:13 crc kubenswrapper[4781]: I0227 00:28:13.986231 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5445c56cbd-fmcjz"]
Feb 27 00:28:13 crc kubenswrapper[4781]: I0227 00:28:13.986475 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5445c56cbd-fmcjz" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerName="neutron-api" containerID="cri-o://0384541fca62a0c17aeca1e73d81a12f432aa7f744f83cfe8433a7d935539961" gracePeriod=30
Feb 27 00:28:13 crc kubenswrapper[4781]: I0227 00:28:13.986600 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5445c56cbd-fmcjz" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerName="neutron-httpd" containerID="cri-o://24cfe8100e51fc567495df6c5f9d60a27bc0381d4315fb54a1ac7e37d2a6bf89" gracePeriod=30
Feb 27 00:28:14 crc kubenswrapper[4781]: I0227 00:28:14.516822 4781 generic.go:334] "Generic (PLEG): container finished" podID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerID="24cfe8100e51fc567495df6c5f9d60a27bc0381d4315fb54a1ac7e37d2a6bf89" exitCode=0
Feb 27 00:28:14 crc kubenswrapper[4781]: I0227 00:28:14.516871 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5445c56cbd-fmcjz" event={"ID":"34294cdd-a18f-4453-8d43-c4d1290e3c59","Type":"ContainerDied","Data":"24cfe8100e51fc567495df6c5f9d60a27bc0381d4315fb54a1ac7e37d2a6bf89"}
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.757388 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5c945d84cf-z5v9s"]
Feb 27
00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.769326 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c945d84cf-z5v9s"] Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.769424 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.772989 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.773087 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.773269 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.895130 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-combined-ca-bundle\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.895191 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8ba5117-540f-448d-aac6-6fde482f5f14-log-httpd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.895321 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8ba5117-540f-448d-aac6-6fde482f5f14-run-httpd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: 
\"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.895377 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52dfd\" (UniqueName: \"kubernetes.io/projected/e8ba5117-540f-448d-aac6-6fde482f5f14-kube-api-access-52dfd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.895448 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-public-tls-certs\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.895648 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-config-data\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.895856 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-internal-tls-certs\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.895905 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e8ba5117-540f-448d-aac6-6fde482f5f14-etc-swift\") 
pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.999098 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8ba5117-540f-448d-aac6-6fde482f5f14-log-httpd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.999161 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8ba5117-540f-448d-aac6-6fde482f5f14-run-httpd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.999207 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52dfd\" (UniqueName: \"kubernetes.io/projected/e8ba5117-540f-448d-aac6-6fde482f5f14-kube-api-access-52dfd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.999258 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-public-tls-certs\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.999367 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-config-data\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: 
\"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:16.999609 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8ba5117-540f-448d-aac6-6fde482f5f14-log-httpd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:16.999676 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8ba5117-540f-448d-aac6-6fde482f5f14-run-httpd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:16.999541 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-internal-tls-certs\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.000153 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e8ba5117-540f-448d-aac6-6fde482f5f14-etc-swift\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.000244 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-combined-ca-bundle\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" 
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.007049 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-internal-tls-certs\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.007998 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e8ba5117-540f-448d-aac6-6fde482f5f14-etc-swift\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.008869 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-config-data\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.009555 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-combined-ca-bundle\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.020180 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-public-tls-certs\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.020560 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-52dfd\" (UniqueName: \"kubernetes.io/projected/e8ba5117-540f-448d-aac6-6fde482f5f14-kube-api-access-52dfd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.107495 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.705077 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.705812 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="ceilometer-central-agent" containerID="cri-o://14d3fcba4ac0c08489e958e9281bb38dcd169375967f24d085bf95a4995989d3" gracePeriod=30 Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.706399 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="proxy-httpd" containerID="cri-o://e1347c9105935db12132917f879bb29404eb3328ae4b62fde3c6673f55672741" gracePeriod=30 Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.706441 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="ceilometer-notification-agent" containerID="cri-o://57f8cc16c7b772f63445194c5db7782e3fcd2bdad4a28c47f2161d0b1572b6c9" gracePeriod=30 Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.706393 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="sg-core" 
containerID="cri-o://41f966adea97cdc475ad08a86a255ece7a9da3613c19d0f63f5a59a5a293320f" gracePeriod=30 Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.718186 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.192:3000/\": EOF" Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.565267 4781 generic.go:334] "Generic (PLEG): container finished" podID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerID="0384541fca62a0c17aeca1e73d81a12f432aa7f744f83cfe8433a7d935539961" exitCode=0 Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.565332 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5445c56cbd-fmcjz" event={"ID":"34294cdd-a18f-4453-8d43-c4d1290e3c59","Type":"ContainerDied","Data":"0384541fca62a0c17aeca1e73d81a12f432aa7f744f83cfe8433a7d935539961"} Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.568244 4781 generic.go:334] "Generic (PLEG): container finished" podID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerID="e1347c9105935db12132917f879bb29404eb3328ae4b62fde3c6673f55672741" exitCode=0 Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.568262 4781 generic.go:334] "Generic (PLEG): container finished" podID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerID="41f966adea97cdc475ad08a86a255ece7a9da3613c19d0f63f5a59a5a293320f" exitCode=2 Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.568270 4781 generic.go:334] "Generic (PLEG): container finished" podID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerID="14d3fcba4ac0c08489e958e9281bb38dcd169375967f24d085bf95a4995989d3" exitCode=0 Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.568284 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerDied","Data":"e1347c9105935db12132917f879bb29404eb3328ae4b62fde3c6673f55672741"} Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.568297 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerDied","Data":"41f966adea97cdc475ad08a86a255ece7a9da3613c19d0f63f5a59a5a293320f"} Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.568307 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerDied","Data":"14d3fcba4ac0c08489e958e9281bb38dcd169375967f24d085bf95a4995989d3"} Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.766871 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.767106 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerName="glance-log" containerID="cri-o://5483ad8c7ab58752a9371dfb8baad38f002e9c4bc521ec62a86d28db8755aca8" gracePeriod=30 Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.767206 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerName="glance-httpd" containerID="cri-o://fd234a650b390b48c2c62ec04eb6c4e5afa5d6f4b0db395429958fa19cde51f2" gracePeriod=30 Feb 27 00:28:19 crc kubenswrapper[4781]: I0227 00:28:19.583886 4781 generic.go:334] "Generic (PLEG): container finished" podID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerID="5483ad8c7ab58752a9371dfb8baad38f002e9c4bc521ec62a86d28db8755aca8" exitCode=143 Feb 27 00:28:19 crc kubenswrapper[4781]: I0227 00:28:19.583922 4781 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb47b6b2-760a-4899-84f6-fdf1bd62a418","Type":"ContainerDied","Data":"5483ad8c7ab58752a9371dfb8baad38f002e9c4bc521ec62a86d28db8755aca8"} Feb 27 00:28:19 crc kubenswrapper[4781]: I0227 00:28:19.711537 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:28:19 crc kubenswrapper[4781]: I0227 00:28:19.711843 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerName="glance-log" containerID="cri-o://1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d" gracePeriod=30 Feb 27 00:28:19 crc kubenswrapper[4781]: I0227 00:28:19.711938 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerName="glance-httpd" containerID="cri-o://16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614" gracePeriod=30 Feb 27 00:28:20 crc kubenswrapper[4781]: I0227 00:28:20.597046 4781 generic.go:334] "Generic (PLEG): container finished" podID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerID="1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d" exitCode=143 Feb 27 00:28:20 crc kubenswrapper[4781]: I0227 00:28:20.597090 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ef40468-5e47-4e34-a641-bfbe7803d480","Type":"ContainerDied","Data":"1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d"} Feb 27 00:28:21 crc kubenswrapper[4781]: I0227 00:28:21.611248 4781 generic.go:334] "Generic (PLEG): container finished" podID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerID="57f8cc16c7b772f63445194c5db7782e3fcd2bdad4a28c47f2161d0b1572b6c9" exitCode=0 Feb 27 00:28:21 crc kubenswrapper[4781]: I0227 00:28:21.611339 4781 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerDied","Data":"57f8cc16c7b772f63445194c5db7782e3fcd2bdad4a28c47f2161d0b1572b6c9"} Feb 27 00:28:22 crc kubenswrapper[4781]: I0227 00:28:22.662995 4781 generic.go:334] "Generic (PLEG): container finished" podID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerID="fd234a650b390b48c2c62ec04eb6c4e5afa5d6f4b0db395429958fa19cde51f2" exitCode=0 Feb 27 00:28:22 crc kubenswrapper[4781]: I0227 00:28:22.663086 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb47b6b2-760a-4899-84f6-fdf1bd62a418","Type":"ContainerDied","Data":"fd234a650b390b48c2c62ec04eb6c4e5afa5d6f4b0db395429958fa19cde51f2"} Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.018284 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.102108 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.130010 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-log-httpd\") pod \"a732412d-8655-4df0-90ba-1bf854b6d8d1\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.130072 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-combined-ca-bundle\") pod \"a732412d-8655-4df0-90ba-1bf854b6d8d1\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.130104 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-config-data\") pod \"a732412d-8655-4df0-90ba-1bf854b6d8d1\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.130139 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-run-httpd\") pod \"a732412d-8655-4df0-90ba-1bf854b6d8d1\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.130204 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-sg-core-conf-yaml\") pod \"a732412d-8655-4df0-90ba-1bf854b6d8d1\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.130297 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lrpr\" (UniqueName: 
\"kubernetes.io/projected/a732412d-8655-4df0-90ba-1bf854b6d8d1-kube-api-access-2lrpr\") pod \"a732412d-8655-4df0-90ba-1bf854b6d8d1\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.130387 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-scripts\") pod \"a732412d-8655-4df0-90ba-1bf854b6d8d1\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.131931 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a732412d-8655-4df0-90ba-1bf854b6d8d1" (UID: "a732412d-8655-4df0-90ba-1bf854b6d8d1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.135751 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a732412d-8655-4df0-90ba-1bf854b6d8d1" (UID: "a732412d-8655-4df0-90ba-1bf854b6d8d1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.141791 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-scripts" (OuterVolumeSpecName: "scripts") pod "a732412d-8655-4df0-90ba-1bf854b6d8d1" (UID: "a732412d-8655-4df0-90ba-1bf854b6d8d1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.144537 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a732412d-8655-4df0-90ba-1bf854b6d8d1-kube-api-access-2lrpr" (OuterVolumeSpecName: "kube-api-access-2lrpr") pod "a732412d-8655-4df0-90ba-1bf854b6d8d1" (UID: "a732412d-8655-4df0-90ba-1bf854b6d8d1"). InnerVolumeSpecName "kube-api-access-2lrpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.173776 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a732412d-8655-4df0-90ba-1bf854b6d8d1" (UID: "a732412d-8655-4df0-90ba-1bf854b6d8d1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.225757 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a732412d-8655-4df0-90ba-1bf854b6d8d1" (UID: "a732412d-8655-4df0-90ba-1bf854b6d8d1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.231691 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-combined-ca-bundle\") pod \"34294cdd-a18f-4453-8d43-c4d1290e3c59\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.231770 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-ovndb-tls-certs\") pod \"34294cdd-a18f-4453-8d43-c4d1290e3c59\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.231878 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-httpd-config\") pod \"34294cdd-a18f-4453-8d43-c4d1290e3c59\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.231912 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b22wp\" (UniqueName: \"kubernetes.io/projected/34294cdd-a18f-4453-8d43-c4d1290e3c59-kube-api-access-b22wp\") pod \"34294cdd-a18f-4453-8d43-c4d1290e3c59\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.231972 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-config\") pod \"34294cdd-a18f-4453-8d43-c4d1290e3c59\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.232455 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.232466 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.232476 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.232484 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.232493 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lrpr\" (UniqueName: \"kubernetes.io/projected/a732412d-8655-4df0-90ba-1bf854b6d8d1-kube-api-access-2lrpr\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.232500 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.237153 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "34294cdd-a18f-4453-8d43-c4d1290e3c59" (UID: "34294cdd-a18f-4453-8d43-c4d1290e3c59"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.239799 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34294cdd-a18f-4453-8d43-c4d1290e3c59-kube-api-access-b22wp" (OuterVolumeSpecName: "kube-api-access-b22wp") pod "34294cdd-a18f-4453-8d43-c4d1290e3c59" (UID: "34294cdd-a18f-4453-8d43-c4d1290e3c59"). InnerVolumeSpecName "kube-api-access-b22wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: W0227 00:28:23.275722 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8ba5117_540f_448d_aac6_6fde482f5f14.slice/crio-724c37c7eaa0223157095c6adcc27778c7a840b1a037151c828e000ed66010e1 WatchSource:0}: Error finding container 724c37c7eaa0223157095c6adcc27778c7a840b1a037151c828e000ed66010e1: Status 404 returned error can't find the container with id 724c37c7eaa0223157095c6adcc27778c7a840b1a037151c828e000ed66010e1 Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.280902 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-config-data" (OuterVolumeSpecName: "config-data") pod "a732412d-8655-4df0-90ba-1bf854b6d8d1" (UID: "a732412d-8655-4df0-90ba-1bf854b6d8d1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.283188 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c945d84cf-z5v9s"] Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.333719 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.333745 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b22wp\" (UniqueName: \"kubernetes.io/projected/34294cdd-a18f-4453-8d43-c4d1290e3c59-kube-api-access-b22wp\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.333756 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.339814 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-config" (OuterVolumeSpecName: "config") pod "34294cdd-a18f-4453-8d43-c4d1290e3c59" (UID: "34294cdd-a18f-4453-8d43-c4d1290e3c59"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.411687 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "34294cdd-a18f-4453-8d43-c4d1290e3c59" (UID: "34294cdd-a18f-4453-8d43-c4d1290e3c59"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.427786 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34294cdd-a18f-4453-8d43-c4d1290e3c59" (UID: "34294cdd-a18f-4453-8d43-c4d1290e3c59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.435818 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.435842 4781 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.435851 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.573509 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.640770 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"6ef40468-5e47-4e34-a641-bfbe7803d480\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.641124 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-internal-tls-certs\") pod \"6ef40468-5e47-4e34-a641-bfbe7803d480\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.641157 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-scripts\") pod \"6ef40468-5e47-4e34-a641-bfbe7803d480\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.641240 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-config-data\") pod \"6ef40468-5e47-4e34-a641-bfbe7803d480\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.641280 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-logs\") pod \"6ef40468-5e47-4e34-a641-bfbe7803d480\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.641318 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-combined-ca-bundle\") pod \"6ef40468-5e47-4e34-a641-bfbe7803d480\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.641449 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-httpd-run\") pod \"6ef40468-5e47-4e34-a641-bfbe7803d480\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.641504 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc459\" (UniqueName: \"kubernetes.io/projected/6ef40468-5e47-4e34-a641-bfbe7803d480-kube-api-access-kc459\") pod \"6ef40468-5e47-4e34-a641-bfbe7803d480\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.642954 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-logs" (OuterVolumeSpecName: "logs") pod "6ef40468-5e47-4e34-a641-bfbe7803d480" (UID: "6ef40468-5e47-4e34-a641-bfbe7803d480"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.643580 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6ef40468-5e47-4e34-a641-bfbe7803d480" (UID: "6ef40468-5e47-4e34-a641-bfbe7803d480"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.650261 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-scripts" (OuterVolumeSpecName: "scripts") pod "6ef40468-5e47-4e34-a641-bfbe7803d480" (UID: "6ef40468-5e47-4e34-a641-bfbe7803d480"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.711940 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerDied","Data":"10c0f1e24689995e21992a81df3156a3ac869c2c63cffc5db5d95aae3523ee7b"} Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.712032 4781 scope.go:117] "RemoveContainer" containerID="e1347c9105935db12132917f879bb29404eb3328ae4b62fde3c6673f55672741" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.712428 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.725124 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c945d84cf-z5v9s" event={"ID":"e8ba5117-540f-448d-aac6-6fde482f5f14","Type":"ContainerStarted","Data":"724c37c7eaa0223157095c6adcc27778c7a840b1a037151c828e000ed66010e1"} Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.728878 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef40468-5e47-4e34-a641-bfbe7803d480-kube-api-access-kc459" (OuterVolumeSpecName: "kube-api-access-kc459") pod "6ef40468-5e47-4e34-a641-bfbe7803d480" (UID: "6ef40468-5e47-4e34-a641-bfbe7803d480"). InnerVolumeSpecName "kube-api-access-kc459". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.744201 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.744249 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.744259 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.744268 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc459\" (UniqueName: \"kubernetes.io/projected/6ef40468-5e47-4e34-a641-bfbe7803d480-kube-api-access-kc459\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.767878 4781 generic.go:334] "Generic (PLEG): container finished" podID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerID="16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614" exitCode=0 Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.767939 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ef40468-5e47-4e34-a641-bfbe7803d480","Type":"ContainerDied","Data":"16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614"} Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.767964 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ef40468-5e47-4e34-a641-bfbe7803d480","Type":"ContainerDied","Data":"7d0ca3340d609e18433fc291df1d484624d9e133542d96a4dff1a09c6cf6905a"} Feb 27 00:28:23 crc 
kubenswrapper[4781]: I0227 00:28:23.768021 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.804189 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5445c56cbd-fmcjz" event={"ID":"34294cdd-a18f-4453-8d43-c4d1290e3c59","Type":"ContainerDied","Data":"35abaddf64ded29044d57543bd49dba6fb7cc622e405ec56e6449b1f79234b7a"} Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.804294 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.823878 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"02c4875e-e180-4365-a00a-828ab5d95c34","Type":"ContainerStarted","Data":"e39e425f056b37a8d5613652dc4f63f4d1fcd7f56dac5cb4ab465c2f4cfc31b4"} Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.876333 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.704420909 podStartE2EDuration="14.876314524s" podCreationTimestamp="2026-02-27 00:28:09 +0000 UTC" firstStartedPulling="2026-02-27 00:28:10.408724793 +0000 UTC m=+1359.666264337" lastFinishedPulling="2026-02-27 00:28:22.580618398 +0000 UTC m=+1371.838157952" observedRunningTime="2026-02-27 00:28:23.861777071 +0000 UTC m=+1373.119316625" watchObservedRunningTime="2026-02-27 00:28:23.876314524 +0000 UTC m=+1373.133854068" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.933790 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549" (OuterVolumeSpecName: "glance") pod "6ef40468-5e47-4e34-a641-bfbe7803d480" (UID: "6ef40468-5e47-4e34-a641-bfbe7803d480"). 
InnerVolumeSpecName "pvc-5bfae319-10bf-453e-8fc6-7da85b46e549". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.948124 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") on node \"crc\" " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.000601 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ef40468-5e47-4e34-a641-bfbe7803d480" (UID: "6ef40468-5e47-4e34-a641-bfbe7803d480"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.005159 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-config-data" (OuterVolumeSpecName: "config-data") pod "6ef40468-5e47-4e34-a641-bfbe7803d480" (UID: "6ef40468-5e47-4e34-a641-bfbe7803d480"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.005535 4781 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.005713 4781 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5bfae319-10bf-453e-8fc6-7da85b46e549" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549") on node "crc" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.039009 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6ef40468-5e47-4e34-a641-bfbe7803d480" (UID: "6ef40468-5e47-4e34-a641-bfbe7803d480"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.050337 4781 reconciler_common.go:293] "Volume detached for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.050647 4781 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.050659 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.050668 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.102663 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.153556 4781 scope.go:117] "RemoveContainer" containerID="41f966adea97cdc475ad08a86a255ece7a9da3613c19d0f63f5a59a5a293320f" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.154293 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-config-data\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155050 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155128 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-logs\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155163 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-httpd-run\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155204 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-combined-ca-bundle\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 
27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155389 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-public-tls-certs\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155644 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155667 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-scripts\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155687 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pftsh\" (UniqueName: \"kubernetes.io/projected/cb47b6b2-760a-4899-84f6-fdf1bd62a418-kube-api-access-pftsh\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155712 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-logs" (OuterVolumeSpecName: "logs") pod "cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.156399 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.156416 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.158499 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-scripts" (OuterVolumeSpecName: "scripts") pod "cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.163731 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb47b6b2-760a-4899-84f6-fdf1bd62a418-kube-api-access-pftsh" (OuterVolumeSpecName: "kube-api-access-pftsh") pod "cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418"). InnerVolumeSpecName "kube-api-access-pftsh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.178964 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5445c56cbd-fmcjz"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.215265 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5445c56cbd-fmcjz"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.227783 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.237954 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.267801 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.279589 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.301736 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.302817 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="ceilometer-central-agent" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.302843 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="ceilometer-central-agent" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 
00:28:24.302861 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="proxy-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.302869 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="proxy-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.302881 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="ceilometer-notification-agent" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.302887 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="ceilometer-notification-agent" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.302895 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerName="glance-log" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.302905 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerName="glance-log" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.302931 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerName="neutron-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.302939 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerName="neutron-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.302961 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="sg-core" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.302967 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="sg-core" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.302991 4781 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerName="neutron-api" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.302998 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerName="neutron-api" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.303022 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerName="glance-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303027 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerName="glance-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.303046 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerName="glance-log" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303053 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerName="glance-log" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.303064 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerName="glance-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303070 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerName="glance-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303607 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="sg-core" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303648 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="ceilometer-central-agent" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303668 4781 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerName="glance-log" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303682 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="proxy-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303703 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerName="glance-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303730 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerName="neutron-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303744 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerName="glance-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303760 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="ceilometer-notification-agent" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303784 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerName="neutron-api" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303796 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerName="glance-log" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.307087 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.309716 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.310564 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b podName:cb47b6b2-760a-4899-84f6-fdf1bd62a418 nodeName:}" failed. No retries permitted until 2026-02-27 00:28:24.810539755 +0000 UTC m=+1374.068079319 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "glance" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b") pod "cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.310994 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pftsh\" (UniqueName: \"kubernetes.io/projected/cb47b6b2-760a-4899-84f6-fdf1bd62a418-kube-api-access-pftsh\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.319753 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.319893 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.346990 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod 
"cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.361581 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.383209 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.386591 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.387536 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-config-data" (OuterVolumeSpecName: "config-data") pod "cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.394501 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.396772 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.403488 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.403694 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.403934 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.412771 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.413006 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-run-httpd\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.413111 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-log-httpd\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.413324 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-scripts\") pod \"ceilometer-0\" (UID: 
\"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.413402 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.413476 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkfb8\" (UniqueName: \"kubernetes.io/projected/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-kube-api-access-qkfb8\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.413599 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-config-data\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.414602 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.414696 4781 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.479428 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.481746 4781 pod_workers.go:1301] "Error syncing pod, 
skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-qkfb8 log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.516911 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-scripts\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.516948 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-scripts\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.516972 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.516999 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517019 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkfb8\" 
(UniqueName: \"kubernetes.io/projected/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-kube-api-access-qkfb8\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517075 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-config-data\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517110 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/141465f3-d299-4d9c-a74f-0df5c741e325-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517135 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517174 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-config-data\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517209 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517245 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517265 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-run-httpd\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517347 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/141465f3-d299-4d9c-a74f-0df5c741e325-logs\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517434 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcz5t\" (UniqueName: \"kubernetes.io/projected/141465f3-d299-4d9c-a74f-0df5c741e325-kube-api-access-fcz5t\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517505 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-log-httpd\") pod 
\"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517595 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-run-httpd\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.518026 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-log-httpd\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.521240 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.521675 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.522142 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-config-data\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.523894 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-scripts\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.547348 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkfb8\" (UniqueName: \"kubernetes.io/projected/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-kube-api-access-qkfb8\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.619560 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.619670 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.619696 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-config-data\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.619758 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/141465f3-d299-4d9c-a74f-0df5c741e325-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.619797 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcz5t\" (UniqueName: \"kubernetes.io/projected/141465f3-d299-4d9c-a74f-0df5c741e325-kube-api-access-fcz5t\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.619903 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-scripts\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.619968 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.620022 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/141465f3-d299-4d9c-a74f-0df5c741e325-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.620481 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/141465f3-d299-4d9c-a74f-0df5c741e325-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.621991 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/141465f3-d299-4d9c-a74f-0df5c741e325-logs\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.626167 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.626221 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-scripts\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.626882 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-config-data\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.627194 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc 
kubenswrapper[4781]: I0227 00:28:24.627364 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.627389 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a7b96405e17327882846f95b5adf8b290f3f24e0a3e5cf6d272cf20133e6cae4/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.642875 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcz5t\" (UniqueName: \"kubernetes.io/projected/141465f3-d299-4d9c-a74f-0df5c741e325-kube-api-access-fcz5t\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.678921 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.776781 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.824268 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.836492 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c945d84cf-z5v9s" event={"ID":"e8ba5117-540f-448d-aac6-6fde482f5f14","Type":"ContainerStarted","Data":"1017d28e75e458419a53d6ae77f42ce11681962ac06d8080c95f2799a44c6f64"} Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.839557 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb47b6b2-760a-4899-84f6-fdf1bd62a418","Type":"ContainerDied","Data":"866ec8dc8dd6eea2cbe5498cddcd7820b1b7a00e1ecb5ebf3196c3d57588106d"} Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.839653 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.843285 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.854263 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.894329 4781 scope.go:117] "RemoveContainer" containerID="576df563fec491fe4b88b02b86a929d4019c459ebde0d69bbe30c74025de222c" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.926680 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-sg-core-conf-yaml\") pod \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.926747 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-config-data\") pod \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.926843 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-log-httpd\") pod \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.926915 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkfb8\" (UniqueName: \"kubernetes.io/projected/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-kube-api-access-qkfb8\") pod \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.926972 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-run-httpd\") pod \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " Feb 27 
00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.927078 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-scripts\") pod \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.927103 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-combined-ca-bundle\") pod \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.927440 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" (UID: "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.927655 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" (UID: "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.928186 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.928206 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.322399 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" path="/var/lib/kubelet/pods/34294cdd-a18f-4453-8d43-c4d1290e3c59/volumes" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.323422 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" path="/var/lib/kubelet/pods/6ef40468-5e47-4e34-a641-bfbe7803d480/volumes" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.324117 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" path="/var/lib/kubelet/pods/a732412d-8655-4df0-90ba-1bf854b6d8d1/volumes" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.332014 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-kube-api-access-qkfb8" (OuterVolumeSpecName: "kube-api-access-qkfb8") pod "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" (UID: "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5"). InnerVolumeSpecName "kube-api-access-qkfb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.332602 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-scripts" (OuterVolumeSpecName: "scripts") pod "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" (UID: "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.332617 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" (UID: "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.334703 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" (UID: "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.335214 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-config-data" (OuterVolumeSpecName: "config-data") pod "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" (UID: "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.339170 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.339217 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.339236 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.339257 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.339275 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkfb8\" (UniqueName: \"kubernetes.io/projected/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-kube-api-access-qkfb8\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.351052 4781 scope.go:117] "RemoveContainer" containerID="57f8cc16c7b772f63445194c5db7782e3fcd2bdad4a28c47f2161d0b1572b6c9" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.366717 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b" (OuterVolumeSpecName: "glance") pod "cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418"). InnerVolumeSpecName "pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.413875 4781 scope.go:117] "RemoveContainer" containerID="14d3fcba4ac0c08489e958e9281bb38dcd169375967f24d085bf95a4995989d3" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.441966 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") on node \"crc\" " Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.507743 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.512967 4781 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.513099 4781 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b") on node "crc" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.526683 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.537842 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.539873 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.542031 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.542241 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.543553 4781 reconciler_common.go:293] "Volume detached for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.548588 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.606113 4781 scope.go:117] "RemoveContainer" containerID="16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.645394 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-logs\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.645454 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.645524 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.645574 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.645608 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.645666 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.645681 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8dlt\" (UniqueName: \"kubernetes.io/projected/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-kube-api-access-s8dlt\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc 
kubenswrapper[4781]: I0227 00:28:25.645699 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.653377 4781 scope.go:117] "RemoveContainer" containerID="1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.719833 4781 scope.go:117] "RemoveContainer" containerID="16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614" Feb 27 00:28:25 crc kubenswrapper[4781]: E0227 00:28:25.720325 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614\": container with ID starting with 16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614 not found: ID does not exist" containerID="16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.720366 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614"} err="failed to get container status \"16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614\": rpc error: code = NotFound desc = could not find container \"16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614\": container with ID starting with 16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614 not found: ID does not exist" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.720396 4781 scope.go:117] "RemoveContainer" containerID="1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d" Feb 27 
00:28:25 crc kubenswrapper[4781]: E0227 00:28:25.720699 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d\": container with ID starting with 1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d not found: ID does not exist" containerID="1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.720734 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d"} err="failed to get container status \"1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d\": rpc error: code = NotFound desc = could not find container \"1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d\": container with ID starting with 1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d not found: ID does not exist" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.720763 4781 scope.go:117] "RemoveContainer" containerID="24cfe8100e51fc567495df6c5f9d60a27bc0381d4315fb54a1ac7e37d2a6bf89" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.747983 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-logs\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.748034 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " 
pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.748101 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.748151 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.748200 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.748980 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-logs\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.749069 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8dlt\" (UniqueName: \"kubernetes.io/projected/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-kube-api-access-s8dlt\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc 
kubenswrapper[4781]: I0227 00:28:25.749096 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.749124 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.749855 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.756248 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.756454 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.765851 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.766357 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.766386 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5d3045414bd1cd74ec61e0394ba262493610c57a87bbc940ef275e8fc1bc2ecf/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.772183 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8dlt\" (UniqueName: \"kubernetes.io/projected/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-kube-api-access-s8dlt\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.783141 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.796441 4781 scope.go:117] "RemoveContainer" 
containerID="0384541fca62a0c17aeca1e73d81a12f432aa7f744f83cfe8433a7d935539961" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.831942 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.863279 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c945d84cf-z5v9s" event={"ID":"e8ba5117-540f-448d-aac6-6fde482f5f14","Type":"ContainerStarted","Data":"c1b2f287d85fff96968a0f89edc99d93ebe479cc29c7781601d414abffec3f4f"} Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.864410 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.864451 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.871275 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.901613 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.921055 4781 scope.go:117] "RemoveContainer" containerID="fd234a650b390b48c2c62ec04eb6c4e5afa5d6f4b0db395429958fa19cde51f2" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.951032 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5c945d84cf-z5v9s" podStartSLOduration=9.950664744000001 podStartE2EDuration="9.950664744s" podCreationTimestamp="2026-02-27 00:28:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:28:25.900388436 +0000 UTC m=+1375.157928000" watchObservedRunningTime="2026-02-27 00:28:25.950664744 +0000 UTC m=+1375.208204298" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.989200 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.000303 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.007652 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.014585 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.016365 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.023956 4781 scope.go:117] "RemoveContainer" containerID="5483ad8c7ab58752a9371dfb8baad38f002e9c4bc521ec62a86d28db8755aca8" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.024537 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.025766 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.056116 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-log-httpd\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.056200 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4sp2\" (UniqueName: \"kubernetes.io/projected/546554c7-b0b0-4363-b1f8-6f83d43562cc-kube-api-access-s4sp2\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.056306 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.056393 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-config-data\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.056425 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.056452 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-run-httpd\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.056478 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-scripts\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.084190 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.157869 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-config-data\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.157910 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.157933 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-run-httpd\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.157952 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-scripts\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.157981 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-log-httpd\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.158026 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4sp2\" (UniqueName: \"kubernetes.io/projected/546554c7-b0b0-4363-b1f8-6f83d43562cc-kube-api-access-s4sp2\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.158120 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc 
kubenswrapper[4781]: W0227 00:28:26.158787 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod141465f3_d299_4d9c_a74f_0df5c741e325.slice/crio-92cc791906e49e4f9f91cf90b227b97c529aa4271f277080ed3b0e4a9ce26cba WatchSource:0}: Error finding container 92cc791906e49e4f9f91cf90b227b97c529aa4271f277080ed3b0e4a9ce26cba: Status 404 returned error can't find the container with id 92cc791906e49e4f9f91cf90b227b97c529aa4271f277080ed3b0e4a9ce26cba Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.161789 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-log-httpd\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.162955 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-scripts\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.164387 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.164823 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.165072 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-config-data\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.167957 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-run-httpd\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.177296 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4sp2\" (UniqueName: \"kubernetes.io/projected/546554c7-b0b0-4363-b1f8-6f83d43562cc-kube-api-access-s4sp2\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.422767 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.631340 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.893663 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f","Type":"ContainerStarted","Data":"4aa356d4f49dee1250771eb9ae06ae0ace500dad12850e3423a5c80de402db7c"} Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.898105 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"141465f3-d299-4d9c-a74f-0df5c741e325","Type":"ContainerStarted","Data":"92cc791906e49e4f9f91cf90b227b97c529aa4271f277080ed3b0e4a9ce26cba"} Feb 27 00:28:27 crc kubenswrapper[4781]: I0227 00:28:27.012204 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:27 crc kubenswrapper[4781]: I0227 00:28:27.339666 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" path="/var/lib/kubelet/pods/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5/volumes" Feb 27 00:28:27 crc kubenswrapper[4781]: I0227 00:28:27.340548 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" path="/var/lib/kubelet/pods/cb47b6b2-760a-4899-84f6-fdf1bd62a418/volumes" Feb 27 00:28:27 crc kubenswrapper[4781]: I0227 00:28:27.936951 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f","Type":"ContainerStarted","Data":"18dc837276b2a2b0958acc6d515dddb9fbb4d20a0b4c318bea5123475d01df79"} Feb 27 00:28:27 crc kubenswrapper[4781]: I0227 00:28:27.943444 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"141465f3-d299-4d9c-a74f-0df5c741e325","Type":"ContainerStarted","Data":"da6a90edd57a33fc09015e81e4fdc592ea63eee487b65d1d6bc96eef651c7157"} Feb 27 00:28:27 crc kubenswrapper[4781]: I0227 00:28:27.943472 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"141465f3-d299-4d9c-a74f-0df5c741e325","Type":"ContainerStarted","Data":"a7629cfeae5afc1dd28b6ff30a5da85200d90e321ba1ec6789f35584daece76c"} Feb 27 00:28:27 crc kubenswrapper[4781]: I0227 00:28:27.950280 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerStarted","Data":"a513fdece7b001c5868cd23c79b7671562961e01d9db2105289006ccfa1d5641"} Feb 27 00:28:27 crc kubenswrapper[4781]: I0227 00:28:27.976961 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.976943187 podStartE2EDuration="3.976943187s" podCreationTimestamp="2026-02-27 00:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:28:27.96593032 +0000 UTC m=+1377.223469874" watchObservedRunningTime="2026-02-27 00:28:27.976943187 +0000 UTC m=+1377.234482731" Feb 27 00:28:28 crc kubenswrapper[4781]: I0227 00:28:28.972330 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f","Type":"ContainerStarted","Data":"81a2f54c4f1eb14da95bab33aa5d3a5c0ed38b81e46bf563e55946cf4995cc42"} Feb 27 00:28:28 crc kubenswrapper[4781]: I0227 00:28:28.981771 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerStarted","Data":"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"} Feb 27 00:28:28 crc kubenswrapper[4781]: I0227 
00:28:28.981952 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerStarted","Data":"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"} Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.010109 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.010070039 podStartE2EDuration="4.010070039s" podCreationTimestamp="2026-02-27 00:28:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:28:29.008106866 +0000 UTC m=+1378.265646430" watchObservedRunningTime="2026-02-27 00:28:29.010070039 +0000 UTC m=+1378.267609593" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.829437 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-kcmlj"] Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.831146 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.845596 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kcmlj"] Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.852126 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gw86\" (UniqueName: \"kubernetes.io/projected/7f0e335a-e4a1-48ee-b470-a6277acc5dae-kube-api-access-9gw86\") pod \"nova-api-db-create-kcmlj\" (UID: \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\") " pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.852261 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f0e335a-e4a1-48ee-b470-a6277acc5dae-operator-scripts\") pod \"nova-api-db-create-kcmlj\" (UID: \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\") " pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.936642 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-lgv6f"] Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.938576 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.951991 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-9245-account-create-update-j6hsh"] Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.953658 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.953953 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f0e335a-e4a1-48ee-b470-a6277acc5dae-operator-scripts\") pod \"nova-api-db-create-kcmlj\" (UID: \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\") " pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.954101 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb5sg\" (UniqueName: \"kubernetes.io/projected/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-kube-api-access-nb5sg\") pod \"nova-cell0-db-create-lgv6f\" (UID: \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\") " pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.954153 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-operator-scripts\") pod \"nova-cell0-db-create-lgv6f\" (UID: \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\") " pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.954233 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gw86\" (UniqueName: \"kubernetes.io/projected/7f0e335a-e4a1-48ee-b470-a6277acc5dae-kube-api-access-9gw86\") pod \"nova-api-db-create-kcmlj\" (UID: \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\") " pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.954837 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f0e335a-e4a1-48ee-b470-a6277acc5dae-operator-scripts\") pod \"nova-api-db-create-kcmlj\" (UID: 
\"7f0e335a-e4a1-48ee-b470-a6277acc5dae\") " pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.956196 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.991503 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gw86\" (UniqueName: \"kubernetes.io/projected/7f0e335a-e4a1-48ee-b470-a6277acc5dae-kube-api-access-9gw86\") pod \"nova-api-db-create-kcmlj\" (UID: \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\") " pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.991574 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-lgv6f"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.005129 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerStarted","Data":"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"} Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.013003 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9245-account-create-update-j6hsh"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.055907 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb5sg\" (UniqueName: \"kubernetes.io/projected/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-kube-api-access-nb5sg\") pod \"nova-cell0-db-create-lgv6f\" (UID: \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\") " pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.055984 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-operator-scripts\") pod \"nova-cell0-db-create-lgv6f\" (UID: 
\"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\") " pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.056059 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6795d880-5f00-4be4-9c67-6f8a251550cb-operator-scripts\") pod \"nova-api-9245-account-create-update-j6hsh\" (UID: \"6795d880-5f00-4be4-9c67-6f8a251550cb\") " pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.056182 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt8x7\" (UniqueName: \"kubernetes.io/projected/6795d880-5f00-4be4-9c67-6f8a251550cb-kube-api-access-vt8x7\") pod \"nova-api-9245-account-create-update-j6hsh\" (UID: \"6795d880-5f00-4be4-9c67-6f8a251550cb\") " pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.059032 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-operator-scripts\") pod \"nova-cell0-db-create-lgv6f\" (UID: \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\") " pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.075147 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-qx8nd"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.076618 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.086478 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qx8nd"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.088189 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb5sg\" (UniqueName: \"kubernetes.io/projected/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-kube-api-access-nb5sg\") pod \"nova-cell0-db-create-lgv6f\" (UID: \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\") " pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.156785 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.158145 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt8x7\" (UniqueName: \"kubernetes.io/projected/6795d880-5f00-4be4-9c67-6f8a251550cb-kube-api-access-vt8x7\") pod \"nova-api-9245-account-create-update-j6hsh\" (UID: \"6795d880-5f00-4be4-9c67-6f8a251550cb\") " pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.158241 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6795d880-5f00-4be4-9c67-6f8a251550cb-operator-scripts\") pod \"nova-api-9245-account-create-update-j6hsh\" (UID: \"6795d880-5f00-4be4-9c67-6f8a251550cb\") " pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.158870 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6795d880-5f00-4be4-9c67-6f8a251550cb-operator-scripts\") pod \"nova-api-9245-account-create-update-j6hsh\" (UID: \"6795d880-5f00-4be4-9c67-6f8a251550cb\") 
" pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.158992 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-141e-account-create-update-msmcr"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.161570 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.167156 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.180161 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-141e-account-create-update-msmcr"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.181820 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt8x7\" (UniqueName: \"kubernetes.io/projected/6795d880-5f00-4be4-9c67-6f8a251550cb-kube-api-access-vt8x7\") pod \"nova-api-9245-account-create-update-j6hsh\" (UID: \"6795d880-5f00-4be4-9c67-6f8a251550cb\") " pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.259864 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7468389a-cc9b-404c-9414-4d81f3b1a7e5-operator-scripts\") pod \"nova-cell1-db-create-qx8nd\" (UID: \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\") " pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.259949 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc7ql\" (UniqueName: \"kubernetes.io/projected/7468389a-cc9b-404c-9414-4d81f3b1a7e5-kube-api-access-sc7ql\") pod \"nova-cell1-db-create-qx8nd\" (UID: \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\") " 
pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.266276 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.281427 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.347848 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cd3e-account-create-update-dkxt7"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.349424 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.352152 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.359811 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cd3e-account-create-update-dkxt7"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.362514 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7468389a-cc9b-404c-9414-4d81f3b1a7e5-operator-scripts\") pod \"nova-cell1-db-create-qx8nd\" (UID: \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\") " pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.362599 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc7ql\" (UniqueName: \"kubernetes.io/projected/7468389a-cc9b-404c-9414-4d81f3b1a7e5-kube-api-access-sc7ql\") pod \"nova-cell1-db-create-qx8nd\" (UID: \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\") " pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.362862 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-operator-scripts\") pod \"nova-cell0-141e-account-create-update-msmcr\" (UID: \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\") " pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.362907 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqc85\" (UniqueName: \"kubernetes.io/projected/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-kube-api-access-cqc85\") pod \"nova-cell0-141e-account-create-update-msmcr\" (UID: \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\") " pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.364099 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7468389a-cc9b-404c-9414-4d81f3b1a7e5-operator-scripts\") pod \"nova-cell1-db-create-qx8nd\" (UID: \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\") " pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.394402 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc7ql\" (UniqueName: \"kubernetes.io/projected/7468389a-cc9b-404c-9414-4d81f3b1a7e5-kube-api-access-sc7ql\") pod \"nova-cell1-db-create-qx8nd\" (UID: \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\") " pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.442826 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.466808 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-operator-scripts\") pod \"nova-cell0-141e-account-create-update-msmcr\" (UID: \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\") " pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.466868 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqc85\" (UniqueName: \"kubernetes.io/projected/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-kube-api-access-cqc85\") pod \"nova-cell0-141e-account-create-update-msmcr\" (UID: \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\") " pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.466971 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-operator-scripts\") pod \"nova-cell1-cd3e-account-create-update-dkxt7\" (UID: \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\") " pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.467010 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcd4c\" (UniqueName: \"kubernetes.io/projected/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-kube-api-access-gcd4c\") pod \"nova-cell1-cd3e-account-create-update-dkxt7\" (UID: \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\") " pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.469270 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-operator-scripts\") pod \"nova-cell0-141e-account-create-update-msmcr\" (UID: \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\") " pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.500150 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqc85\" (UniqueName: \"kubernetes.io/projected/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-kube-api-access-cqc85\") pod \"nova-cell0-141e-account-create-update-msmcr\" (UID: \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\") " pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.575531 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-operator-scripts\") pod \"nova-cell1-cd3e-account-create-update-dkxt7\" (UID: \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\") " pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.575598 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcd4c\" (UniqueName: \"kubernetes.io/projected/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-kube-api-access-gcd4c\") pod \"nova-cell1-cd3e-account-create-update-dkxt7\" (UID: \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\") " pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.577166 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.584826 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-operator-scripts\") pod \"nova-cell1-cd3e-account-create-update-dkxt7\" (UID: \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\") " pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.610920 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcd4c\" (UniqueName: \"kubernetes.io/projected/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-kube-api-access-gcd4c\") pod \"nova-cell1-cd3e-account-create-update-dkxt7\" (UID: \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\") " pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.675307 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.829353 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kcmlj"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.984698 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9245-account-create-update-j6hsh"] Feb 27 00:28:31 crc kubenswrapper[4781]: I0227 00:28:31.059180 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9245-account-create-update-j6hsh" event={"ID":"6795d880-5f00-4be4-9c67-6f8a251550cb","Type":"ContainerStarted","Data":"e6bf1e81e83109b20ee78a4698c5f46cec050fb71165f48f7b679ef30e434cbe"} Feb 27 00:28:31 crc kubenswrapper[4781]: I0227 00:28:31.070086 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qx8nd"] Feb 27 00:28:31 crc kubenswrapper[4781]: I0227 00:28:31.080345 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-lgv6f"] Feb 27 00:28:31 crc kubenswrapper[4781]: I0227 00:28:31.090606 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kcmlj" event={"ID":"7f0e335a-e4a1-48ee-b470-a6277acc5dae","Type":"ContainerStarted","Data":"35c016eb10e43ba219fe0c2064520734a36f7363a214ff58f2ffde62c09da07b"} Feb 27 00:28:31 crc kubenswrapper[4781]: W0227 00:28:31.097665 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7468389a_cc9b_404c_9414_4d81f3b1a7e5.slice/crio-5576b81dd5904dbe30bed1665d83072721616cae57d7d91a0cf5669aa863c1c1 WatchSource:0}: Error finding container 5576b81dd5904dbe30bed1665d83072721616cae57d7d91a0cf5669aa863c1c1: Status 404 returned error can't find the container with id 5576b81dd5904dbe30bed1665d83072721616cae57d7d91a0cf5669aa863c1c1 Feb 27 00:28:31 crc kubenswrapper[4781]: W0227 00:28:31.102379 4781 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b9af6a0_49e8_462c_80d6_df8a3d3bd4ce.slice/crio-8ff62ce7ed0f99a13dbdc14aa1d4d0850ddd0de4549c511924fdd434d0ff9998 WatchSource:0}: Error finding container 8ff62ce7ed0f99a13dbdc14aa1d4d0850ddd0de4549c511924fdd434d0ff9998: Status 404 returned error can't find the container with id 8ff62ce7ed0f99a13dbdc14aa1d4d0850ddd0de4549c511924fdd434d0ff9998 Feb 27 00:28:31 crc kubenswrapper[4781]: I0227 00:28:31.286200 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cd3e-account-create-update-dkxt7"] Feb 27 00:28:31 crc kubenswrapper[4781]: I0227 00:28:31.300349 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-141e-account-create-update-msmcr"] Feb 27 00:28:31 crc kubenswrapper[4781]: I0227 00:28:31.434962 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 27 00:28:31 crc kubenswrapper[4781]: I0227 00:28:31.443737 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.109792 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" event={"ID":"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0","Type":"ContainerStarted","Data":"f0ed84118ee5cf9c2ed24eb79c583f0f931f25971746f3f5ae7f1d86952188c8"} Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.114880 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.115934 4781 generic.go:334] "Generic (PLEG): container finished" podID="5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce" containerID="24536e1e89dfec02307e517e9566052e3516ec64369f8d65d2939b8e4650f889" exitCode=0 Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.116040 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lgv6f" event={"ID":"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce","Type":"ContainerDied","Data":"24536e1e89dfec02307e517e9566052e3516ec64369f8d65d2939b8e4650f889"} Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.116074 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lgv6f" event={"ID":"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce","Type":"ContainerStarted","Data":"8ff62ce7ed0f99a13dbdc14aa1d4d0850ddd0de4549c511924fdd434d0ff9998"} Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.130396 4781 generic.go:334] "Generic (PLEG): container finished" podID="6795d880-5f00-4be4-9c67-6f8a251550cb" containerID="e064657ef0c106a3592f283bb81ae42d2444dda1caced8f721f45cdcfe863108" exitCode=0 Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.130479 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9245-account-create-update-j6hsh" event={"ID":"6795d880-5f00-4be4-9c67-6f8a251550cb","Type":"ContainerDied","Data":"e064657ef0c106a3592f283bb81ae42d2444dda1caced8f721f45cdcfe863108"} Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.131959 4781 generic.go:334] "Generic (PLEG): container finished" podID="7468389a-cc9b-404c-9414-4d81f3b1a7e5" containerID="12e5844f351b3d039dc82ba98df27afa29e4eaea9f5b2ec45b3c8cb5d018e0ca" exitCode=0 Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.132071 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qx8nd" event={"ID":"7468389a-cc9b-404c-9414-4d81f3b1a7e5","Type":"ContainerDied","Data":"12e5844f351b3d039dc82ba98df27afa29e4eaea9f5b2ec45b3c8cb5d018e0ca"} Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.132123 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qx8nd" 
event={"ID":"7468389a-cc9b-404c-9414-4d81f3b1a7e5","Type":"ContainerStarted","Data":"5576b81dd5904dbe30bed1665d83072721616cae57d7d91a0cf5669aa863c1c1"} Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.133657 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-141e-account-create-update-msmcr" event={"ID":"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5","Type":"ContainerStarted","Data":"3f0c43c1a4e5a5291167e1fa7fd5751b434133130a3345ffad218933d1ced585"} Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.137299 4781 generic.go:334] "Generic (PLEG): container finished" podID="7f0e335a-e4a1-48ee-b470-a6277acc5dae" containerID="ae3d06d551b95e82732253f74b171a292fd2201889c2e3a5a620c3b16fb394dd" exitCode=0 Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.137347 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kcmlj" event={"ID":"7f0e335a-e4a1-48ee-b470-a6277acc5dae","Type":"ContainerDied","Data":"ae3d06d551b95e82732253f74b171a292fd2201889c2e3a5a620c3b16fb394dd"} Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.137440 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.148269 4781 generic.go:334] "Generic (PLEG): container finished" podID="2b4dbafa-fefb-4947-8d71-f7b0057a2ba0" containerID="feae0a2cae038402fdacbd138e93b4a28e83ea37dfdf069227fa89f2c8eea228" exitCode=0 Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.148358 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" event={"ID":"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0","Type":"ContainerDied","Data":"feae0a2cae038402fdacbd138e93b4a28e83ea37dfdf069227fa89f2c8eea228"} Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.151657 4781 generic.go:334] "Generic (PLEG): container finished" podID="c2f8e017-da89-4ce0-a5b7-2339b2cf18a5" 
containerID="a7acf67e842e66e4a577e00cfd7561f83ca973cea54d959ed8fb7c9427da2a89" exitCode=0 Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.151735 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-141e-account-create-update-msmcr" event={"ID":"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5","Type":"ContainerDied","Data":"a7acf67e842e66e4a577e00cfd7561f83ca973cea54d959ed8fb7c9427da2a89"} Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.154771 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerStarted","Data":"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"} Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.216124 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.46270243 podStartE2EDuration="8.216102889s" podCreationTimestamp="2026-02-27 00:28:25 +0000 UTC" firstStartedPulling="2026-02-27 00:28:27.052133282 +0000 UTC m=+1376.309672836" lastFinishedPulling="2026-02-27 00:28:31.805533741 +0000 UTC m=+1381.063073295" observedRunningTime="2026-02-27 00:28:33.208811742 +0000 UTC m=+1382.466351316" watchObservedRunningTime="2026-02-27 00:28:33.216102889 +0000 UTC m=+1382.473642443" Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.738455 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.849437 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gw86\" (UniqueName: \"kubernetes.io/projected/7f0e335a-e4a1-48ee-b470-a6277acc5dae-kube-api-access-9gw86\") pod \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\" (UID: \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\") " Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.849503 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f0e335a-e4a1-48ee-b470-a6277acc5dae-operator-scripts\") pod \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\" (UID: \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\") " Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.850876 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f0e335a-e4a1-48ee-b470-a6277acc5dae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f0e335a-e4a1-48ee-b470-a6277acc5dae" (UID: "7f0e335a-e4a1-48ee-b470-a6277acc5dae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.856028 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f0e335a-e4a1-48ee-b470-a6277acc5dae-kube-api-access-9gw86" (OuterVolumeSpecName: "kube-api-access-9gw86") pod "7f0e335a-e4a1-48ee-b470-a6277acc5dae" (UID: "7f0e335a-e4a1-48ee-b470-a6277acc5dae"). InnerVolumeSpecName "kube-api-access-9gw86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.951994 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gw86\" (UniqueName: \"kubernetes.io/projected/7f0e335a-e4a1-48ee-b470-a6277acc5dae-kube-api-access-9gw86\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.952351 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f0e335a-e4a1-48ee-b470-a6277acc5dae-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.022241 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.031269 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.036576 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.154531 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb5sg\" (UniqueName: \"kubernetes.io/projected/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-kube-api-access-nb5sg\") pod \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\" (UID: \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\") " Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.154711 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6795d880-5f00-4be4-9c67-6f8a251550cb-operator-scripts\") pod \"6795d880-5f00-4be4-9c67-6f8a251550cb\" (UID: \"6795d880-5f00-4be4-9c67-6f8a251550cb\") " Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.154733 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt8x7\" (UniqueName: \"kubernetes.io/projected/6795d880-5f00-4be4-9c67-6f8a251550cb-kube-api-access-vt8x7\") pod \"6795d880-5f00-4be4-9c67-6f8a251550cb\" (UID: \"6795d880-5f00-4be4-9c67-6f8a251550cb\") " Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.154750 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc7ql\" (UniqueName: \"kubernetes.io/projected/7468389a-cc9b-404c-9414-4d81f3b1a7e5-kube-api-access-sc7ql\") pod \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\" (UID: \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\") " Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.154898 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-operator-scripts\") pod \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\" (UID: \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\") " Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.154986 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7468389a-cc9b-404c-9414-4d81f3b1a7e5-operator-scripts\") pod \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\" (UID: \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\") " Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.155349 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6795d880-5f00-4be4-9c67-6f8a251550cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6795d880-5f00-4be4-9c67-6f8a251550cb" (UID: "6795d880-5f00-4be4-9c67-6f8a251550cb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.155536 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce" (UID: "5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.155809 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7468389a-cc9b-404c-9414-4d81f3b1a7e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7468389a-cc9b-404c-9414-4d81f3b1a7e5" (UID: "7468389a-cc9b-404c-9414-4d81f3b1a7e5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.160524 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6795d880-5f00-4be4-9c67-6f8a251550cb-kube-api-access-vt8x7" (OuterVolumeSpecName: "kube-api-access-vt8x7") pod "6795d880-5f00-4be4-9c67-6f8a251550cb" (UID: "6795d880-5f00-4be4-9c67-6f8a251550cb"). 
InnerVolumeSpecName "kube-api-access-vt8x7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.160650 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7468389a-cc9b-404c-9414-4d81f3b1a7e5-kube-api-access-sc7ql" (OuterVolumeSpecName: "kube-api-access-sc7ql") pod "7468389a-cc9b-404c-9414-4d81f3b1a7e5" (UID: "7468389a-cc9b-404c-9414-4d81f3b1a7e5"). InnerVolumeSpecName "kube-api-access-sc7ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.160727 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-kube-api-access-nb5sg" (OuterVolumeSpecName: "kube-api-access-nb5sg") pod "5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce" (UID: "5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce"). InnerVolumeSpecName "kube-api-access-nb5sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.213667 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lgv6f" event={"ID":"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce","Type":"ContainerDied","Data":"8ff62ce7ed0f99a13dbdc14aa1d4d0850ddd0de4549c511924fdd434d0ff9998"} Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.213709 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ff62ce7ed0f99a13dbdc14aa1d4d0850ddd0de4549c511924fdd434d0ff9998" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.213787 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.222811 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9245-account-create-update-j6hsh" event={"ID":"6795d880-5f00-4be4-9c67-6f8a251550cb","Type":"ContainerDied","Data":"e6bf1e81e83109b20ee78a4698c5f46cec050fb71165f48f7b679ef30e434cbe"} Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.222860 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6bf1e81e83109b20ee78a4698c5f46cec050fb71165f48f7b679ef30e434cbe" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.222890 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.224847 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qx8nd" event={"ID":"7468389a-cc9b-404c-9414-4d81f3b1a7e5","Type":"ContainerDied","Data":"5576b81dd5904dbe30bed1665d83072721616cae57d7d91a0cf5669aa863c1c1"} Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.224891 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5576b81dd5904dbe30bed1665d83072721616cae57d7d91a0cf5669aa863c1c1" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.224971 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.232380 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.232693 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kcmlj" event={"ID":"7f0e335a-e4a1-48ee-b470-a6277acc5dae","Type":"ContainerDied","Data":"35c016eb10e43ba219fe0c2064520734a36f7363a214ff58f2ffde62c09da07b"} Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.232737 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35c016eb10e43ba219fe0c2064520734a36f7363a214ff58f2ffde62c09da07b" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.232980 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.257232 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6795d880-5f00-4be4-9c67-6f8a251550cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.257258 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt8x7\" (UniqueName: \"kubernetes.io/projected/6795d880-5f00-4be4-9c67-6f8a251550cb-kube-api-access-vt8x7\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.257269 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc7ql\" (UniqueName: \"kubernetes.io/projected/7468389a-cc9b-404c-9414-4d81f3b1a7e5-kube-api-access-sc7ql\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.257278 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.257287 4781 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7468389a-cc9b-404c-9414-4d81f3b1a7e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.257296 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb5sg\" (UniqueName: \"kubernetes.io/projected/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-kube-api-access-nb5sg\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.590482 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.722450 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.766422 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqc85\" (UniqueName: \"kubernetes.io/projected/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-kube-api-access-cqc85\") pod \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\" (UID: \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\") " Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.766570 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-operator-scripts\") pod \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\" (UID: \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\") " Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.767440 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2f8e017-da89-4ce0-a5b7-2339b2cf18a5" (UID: "c2f8e017-da89-4ce0-a5b7-2339b2cf18a5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.777989 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.779169 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.779949 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-kube-api-access-cqc85" (OuterVolumeSpecName: "kube-api-access-cqc85") pod "c2f8e017-da89-4ce0-a5b7-2339b2cf18a5" (UID: "c2f8e017-da89-4ce0-a5b7-2339b2cf18a5"). InnerVolumeSpecName "kube-api-access-cqc85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.821227 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.833316 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.869002 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-operator-scripts\") pod \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\" (UID: \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\") " Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.869952 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcd4c\" (UniqueName: \"kubernetes.io/projected/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-kube-api-access-gcd4c\") pod \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\" (UID: \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\") " Feb 27 00:28:34 
crc kubenswrapper[4781]: I0227 00:28:34.869397 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b4dbafa-fefb-4947-8d71-f7b0057a2ba0" (UID: "2b4dbafa-fefb-4947-8d71-f7b0057a2ba0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.870978 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.871003 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqc85\" (UniqueName: \"kubernetes.io/projected/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-kube-api-access-cqc85\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.871020 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.878939 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-kube-api-access-gcd4c" (OuterVolumeSpecName: "kube-api-access-gcd4c") pod "2b4dbafa-fefb-4947-8d71-f7b0057a2ba0" (UID: "2b4dbafa-fefb-4947-8d71-f7b0057a2ba0"). InnerVolumeSpecName "kube-api-access-gcd4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.972825 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcd4c\" (UniqueName: \"kubernetes.io/projected/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-kube-api-access-gcd4c\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.244206 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-141e-account-create-update-msmcr" event={"ID":"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5","Type":"ContainerDied","Data":"3f0c43c1a4e5a5291167e1fa7fd5751b434133130a3345ffad218933d1ced585"} Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.244257 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f0c43c1a4e5a5291167e1fa7fd5751b434133130a3345ffad218933d1ced585" Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.245055 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.245994 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" event={"ID":"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0","Type":"ContainerDied","Data":"f0ed84118ee5cf9c2ed24eb79c583f0f931f25971746f3f5ae7f1d86952188c8"} Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.246031 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0ed84118ee5cf9c2ed24eb79c583f0f931f25971746f3f5ae7f1d86952188c8" Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.246037 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.246603 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.246735 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.604840 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.902667 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.902930 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.937341 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.955808 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 27 00:28:36 crc kubenswrapper[4781]: I0227 00:28:36.258365 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 27 00:28:36 crc kubenswrapper[4781]: I0227 00:28:36.258528 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="ceilometer-central-agent" containerID="cri-o://3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486" gracePeriod=30 Feb 27 00:28:36 crc kubenswrapper[4781]: I0227 00:28:36.258620 4781 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="proxy-httpd" containerID="cri-o://5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7" gracePeriod=30 Feb 27 00:28:36 crc kubenswrapper[4781]: I0227 00:28:36.258641 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="sg-core" containerID="cri-o://e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd" gracePeriod=30 Feb 27 00:28:36 crc kubenswrapper[4781]: I0227 00:28:36.258817 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 27 00:28:36 crc kubenswrapper[4781]: I0227 00:28:36.258860 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="ceilometer-notification-agent" containerID="cri-o://f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8" gracePeriod=30 Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.191600 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270299 4781 generic.go:334] "Generic (PLEG): container finished" podID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerID="5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7" exitCode=0 Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270333 4781 generic.go:334] "Generic (PLEG): container finished" podID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerID="e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd" exitCode=2 Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270340 4781 generic.go:334] "Generic (PLEG): container finished" podID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerID="f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8" exitCode=0 Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270347 4781 generic.go:334] "Generic (PLEG): container finished" podID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerID="3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486" exitCode=0 Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270366 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270400 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerDied","Data":"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"} Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270441 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerDied","Data":"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"} Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270452 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerDied","Data":"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"} Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270463 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerDied","Data":"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"} Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270471 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerDied","Data":"a513fdece7b001c5868cd23c79b7671562961e01d9db2105289006ccfa1d5641"} Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270486 4781 scope.go:117] "RemoveContainer" containerID="5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.301244 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.301363 4781 prober_manager.go:312] "Failed to 
trigger a manual run" probe="Readiness" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.303586 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.308941 4781 scope.go:117] "RemoveContainer" containerID="e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.320283 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-run-httpd\") pod \"546554c7-b0b0-4363-b1f8-6f83d43562cc\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.320331 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-log-httpd\") pod \"546554c7-b0b0-4363-b1f8-6f83d43562cc\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.320427 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-config-data\") pod \"546554c7-b0b0-4363-b1f8-6f83d43562cc\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.320454 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4sp2\" (UniqueName: \"kubernetes.io/projected/546554c7-b0b0-4363-b1f8-6f83d43562cc-kube-api-access-s4sp2\") pod \"546554c7-b0b0-4363-b1f8-6f83d43562cc\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.320479 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-sg-core-conf-yaml\") pod \"546554c7-b0b0-4363-b1f8-6f83d43562cc\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.320527 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-scripts\") pod \"546554c7-b0b0-4363-b1f8-6f83d43562cc\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.320564 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-combined-ca-bundle\") pod \"546554c7-b0b0-4363-b1f8-6f83d43562cc\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.323284 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "546554c7-b0b0-4363-b1f8-6f83d43562cc" (UID: "546554c7-b0b0-4363-b1f8-6f83d43562cc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.327017 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "546554c7-b0b0-4363-b1f8-6f83d43562cc" (UID: "546554c7-b0b0-4363-b1f8-6f83d43562cc"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.331065 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-scripts" (OuterVolumeSpecName: "scripts") pod "546554c7-b0b0-4363-b1f8-6f83d43562cc" (UID: "546554c7-b0b0-4363-b1f8-6f83d43562cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.339858 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/546554c7-b0b0-4363-b1f8-6f83d43562cc-kube-api-access-s4sp2" (OuterVolumeSpecName: "kube-api-access-s4sp2") pod "546554c7-b0b0-4363-b1f8-6f83d43562cc" (UID: "546554c7-b0b0-4363-b1f8-6f83d43562cc"). InnerVolumeSpecName "kube-api-access-s4sp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.347841 4781 scope.go:117] "RemoveContainer" containerID="f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.359835 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "546554c7-b0b0-4363-b1f8-6f83d43562cc" (UID: "546554c7-b0b0-4363-b1f8-6f83d43562cc"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.426181 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.426490 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.426499 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4sp2\" (UniqueName: \"kubernetes.io/projected/546554c7-b0b0-4363-b1f8-6f83d43562cc-kube-api-access-s4sp2\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.426509 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.426519 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.478912 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "546554c7-b0b0-4363-b1f8-6f83d43562cc" (UID: "546554c7-b0b0-4363-b1f8-6f83d43562cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.527565 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.540907 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-config-data" (OuterVolumeSpecName: "config-data") pod "546554c7-b0b0-4363-b1f8-6f83d43562cc" (UID: "546554c7-b0b0-4363-b1f8-6f83d43562cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.624988 4781 scope.go:117] "RemoveContainer" containerID="3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.631048 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.632738 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.657041 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.665070 4781 scope.go:117] "RemoveContainer" containerID="5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.668210 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": container with ID starting with 
5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7 not found: ID does not exist" containerID="5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.668255 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"} err="failed to get container status \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": rpc error: code = NotFound desc = could not find container \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": container with ID starting with 5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7 not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.668280 4781 scope.go:117] "RemoveContainer" containerID="e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675248 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675719 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="ceilometer-notification-agent" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675740 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="ceilometer-notification-agent" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675752 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7468389a-cc9b-404c-9414-4d81f3b1a7e5" containerName="mariadb-database-create" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675758 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7468389a-cc9b-404c-9414-4d81f3b1a7e5" containerName="mariadb-database-create" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675770 4781 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f8e017-da89-4ce0-a5b7-2339b2cf18a5" containerName="mariadb-account-create-update" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675777 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f8e017-da89-4ce0-a5b7-2339b2cf18a5" containerName="mariadb-account-create-update" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675803 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="proxy-httpd" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675811 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="proxy-httpd" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675822 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="ceilometer-central-agent" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675830 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="ceilometer-central-agent" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675845 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f0e335a-e4a1-48ee-b470-a6277acc5dae" containerName="mariadb-database-create" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675850 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f0e335a-e4a1-48ee-b470-a6277acc5dae" containerName="mariadb-database-create" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675860 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="sg-core" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675866 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="sg-core" Feb 27 00:28:37 crc kubenswrapper[4781]: 
E0227 00:28:37.675873 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6795d880-5f00-4be4-9c67-6f8a251550cb" containerName="mariadb-account-create-update"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675879 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6795d880-5f00-4be4-9c67-6f8a251550cb" containerName="mariadb-account-create-update"
Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675891 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce" containerName="mariadb-database-create"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675897 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce" containerName="mariadb-database-create"
Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675904 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4dbafa-fefb-4947-8d71-f7b0057a2ba0" containerName="mariadb-account-create-update"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675910 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4dbafa-fefb-4947-8d71-f7b0057a2ba0" containerName="mariadb-account-create-update"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676124 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7468389a-cc9b-404c-9414-4d81f3b1a7e5" containerName="mariadb-database-create"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676141 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="proxy-httpd"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676160 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce" containerName="mariadb-database-create"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676174 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="sg-core"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676183 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f0e335a-e4a1-48ee-b470-a6277acc5dae" containerName="mariadb-database-create"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676199 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2f8e017-da89-4ce0-a5b7-2339b2cf18a5" containerName="mariadb-account-create-update"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676212 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6795d880-5f00-4be4-9c67-6f8a251550cb" containerName="mariadb-account-create-update"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676226 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b4dbafa-fefb-4947-8d71-f7b0057a2ba0" containerName="mariadb-account-create-update"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676238 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="ceilometer-notification-agent"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676246 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="ceilometer-central-agent"
Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.676433 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": container with ID starting with e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd not found: ID does not exist" containerID="e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676468 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"} err="failed to get container status \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": rpc error: code = NotFound desc = could not find container \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": container with ID starting with e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd not found: ID does not exist"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676494 4781 scope.go:117] "RemoveContainer" containerID="f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.678021 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.681743 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": container with ID starting with f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8 not found: ID does not exist" containerID="f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.681782 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"} err="failed to get container status \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": rpc error: code = NotFound desc = could not find container \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": container with ID starting with f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8 not found: ID does not exist"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.681804 4781 scope.go:117] "RemoveContainer" containerID="3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"
Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.686766 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": container with ID starting with 3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486 not found: ID does not exist" containerID="3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.686803 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"} err="failed to get container status \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": rpc error: code = NotFound desc = could not find container \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": container with ID starting with 3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486 not found: ID does not exist"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.686825 4781 scope.go:117] "RemoveContainer" containerID="5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.690701 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"} err="failed to get container status \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": rpc error: code = NotFound desc = could not find container \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": container with ID starting with 5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7 not found: ID does not exist"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.690725 4781 scope.go:117] "RemoveContainer" containerID="e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.694874 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"} err="failed to get container status \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": rpc error: code = NotFound desc = could not find container \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": container with ID starting with e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd not found: ID does not exist"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.694924 4781 scope.go:117] "RemoveContainer" containerID="f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.700064 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"} err="failed to get container status \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": rpc error: code = NotFound desc = could not find container \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": container with ID starting with f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8 not found: ID does not exist"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.700113 4781 scope.go:117] "RemoveContainer" containerID="3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.703857 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.704555 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.705802 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.708920 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"} err="failed to get container status \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": rpc error: code = NotFound desc = could not find container \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": container with ID starting with 3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486 not found: ID does not exist"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.708956 4781 scope.go:117] "RemoveContainer" containerID="5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.711774 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"} err="failed to get container status \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": rpc error: code = NotFound desc = could not find container \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": container with ID starting with 5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7 not found: ID does not exist"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.711800 4781 scope.go:117] "RemoveContainer" containerID="e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.713313 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"} err="failed to get container status \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": rpc error: code = NotFound desc = could not find container \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": container with ID starting with e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd not found: ID does not exist"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.713349 4781 scope.go:117] "RemoveContainer" containerID="f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.716922 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"} err="failed to get container status \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": rpc error: code = NotFound desc = could not find container \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": container with ID starting with f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8 not found: ID does not exist"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.716958 4781 scope.go:117] "RemoveContainer" containerID="3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.717281 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"} err="failed to get container status \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": rpc error: code = NotFound desc = could not find container \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": container with ID starting with 3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486 not found: ID does not exist"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.717298 4781 scope.go:117] "RemoveContainer" containerID="5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.717546 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"} err="failed to get container status \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": rpc error: code = NotFound desc = could not find container \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": container with ID starting with 5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7 not found: ID does not exist"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.717564 4781 scope.go:117] "RemoveContainer" containerID="e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.717838 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"} err="failed to get container status \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": rpc error: code = NotFound desc = could not find container \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": container with ID starting with e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd not found: ID does not exist"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.717853 4781 scope.go:117] "RemoveContainer" containerID="f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.718102 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"} err="failed to get container status \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": rpc error: code = NotFound desc = could not find container \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": container with ID starting with f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8 not found: ID does not exist"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.718117 4781 scope.go:117] "RemoveContainer" containerID="3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.718339 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"} err="failed to get container status \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": rpc error: code = NotFound desc = could not find container \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": container with ID starting with 3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486 not found: ID does not exist"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.834179 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-scripts\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.834252 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.835047 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-config-data\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.835195 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hs5n\" (UniqueName: \"kubernetes.io/projected/6c0d1328-b565-4c9e-a9dc-e7b863568260-kube-api-access-2hs5n\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.835306 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.835458 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-log-httpd\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.835622 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-run-httpd\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.937928 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-run-httpd\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.937993 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-scripts\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.938030 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.938068 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-config-data\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.938104 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hs5n\" (UniqueName: \"kubernetes.io/projected/6c0d1328-b565-4c9e-a9dc-e7b863568260-kube-api-access-2hs5n\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.938123 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.938149 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-log-httpd\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.938654 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-log-httpd\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.938866 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-run-httpd\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.944440 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-config-data\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.944830 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.946184 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.948293 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-scripts\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.960281 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hs5n\" (UniqueName: \"kubernetes.io/projected/6c0d1328-b565-4c9e-a9dc-e7b863568260-kube-api-access-2hs5n\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.995185 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d64c6bb46-jcp5p"
Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.004416 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.040438 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d64c6bb46-jcp5p"
Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.134576 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-76c479bbf8-lkpd7"]
Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.134818 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-76c479bbf8-lkpd7" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerName="placement-log" containerID="cri-o://412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8" gracePeriod=30
Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.135224 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-76c479bbf8-lkpd7" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerName="placement-api" containerID="cri-o://6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68" gracePeriod=30
Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.292280 4781 generic.go:334] "Generic (PLEG): container finished" podID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerID="412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8" exitCode=143
Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.292697 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c479bbf8-lkpd7" event={"ID":"33c297e1-af3e-46d6-9738-8e6833deaf02","Type":"ContainerDied","Data":"412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8"}
Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.549151 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.580954 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.651184 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.651276 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.797266 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 27 00:28:39 crc kubenswrapper[4781]: I0227 00:28:39.304329 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerStarted","Data":"7a5345b65b014bc9d0e2cd844013d91d1a91d4e408c41f0c7f4f964de80130f6"}
Feb 27 00:28:39 crc kubenswrapper[4781]: I0227 00:28:39.304649 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerStarted","Data":"72f31190f456d76c84b55f85362321bc4ca382df7c3a1c86e9e23616be0d7246"}
Feb 27 00:28:39 crc kubenswrapper[4781]: I0227 00:28:39.321692 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" path="/var/lib/kubelet/pods/546554c7-b0b0-4363-b1f8-6f83d43562cc/volumes"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.317083 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerStarted","Data":"16ba8a242e20589655027929d1c82fa25c3d9fc988018051237357efea8a8ec9"}
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.442749 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9cntr"]
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.448521 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9cntr"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.451613 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.451913 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.455030 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-lsptr"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.463311 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9cntr"]
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.607642 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksn6l\" (UniqueName: \"kubernetes.io/projected/d71a5c1e-7953-4acf-813a-0d96d4992d1f-kube-api-access-ksn6l\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.607706 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-config-data\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.607743 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-scripts\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.607883 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.710114 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksn6l\" (UniqueName: \"kubernetes.io/projected/d71a5c1e-7953-4acf-813a-0d96d4992d1f-kube-api-access-ksn6l\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.710162 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-config-data\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.710187 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-scripts\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.710245 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.717356 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-config-data\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.717702 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.721065 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-scripts\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.741203 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksn6l\" (UniqueName: \"kubernetes.io/projected/d71a5c1e-7953-4acf-813a-0d96d4992d1f-kube-api-access-ksn6l\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr"
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.850963 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9cntr"
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.343521 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerStarted","Data":"9155e1f68a6370d2a59d952aff96914080df4756a62f18bb9bbc3ec507e49ef4"}
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.420565 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9cntr"]
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.756220 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76c479bbf8-lkpd7"
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.832508 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-combined-ca-bundle\") pod \"33c297e1-af3e-46d6-9738-8e6833deaf02\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") "
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.832914 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnnv4\" (UniqueName: \"kubernetes.io/projected/33c297e1-af3e-46d6-9738-8e6833deaf02-kube-api-access-jnnv4\") pod \"33c297e1-af3e-46d6-9738-8e6833deaf02\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") "
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.832988 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-config-data\") pod \"33c297e1-af3e-46d6-9738-8e6833deaf02\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") "
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.833006 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-internal-tls-certs\") pod \"33c297e1-af3e-46d6-9738-8e6833deaf02\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") "
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.833072 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33c297e1-af3e-46d6-9738-8e6833deaf02-logs\") pod \"33c297e1-af3e-46d6-9738-8e6833deaf02\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") "
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.833092 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-public-tls-certs\") pod \"33c297e1-af3e-46d6-9738-8e6833deaf02\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") "
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.833118 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-scripts\") pod \"33c297e1-af3e-46d6-9738-8e6833deaf02\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") "
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.834672 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33c297e1-af3e-46d6-9738-8e6833deaf02-logs" (OuterVolumeSpecName: "logs") pod "33c297e1-af3e-46d6-9738-8e6833deaf02" (UID: "33c297e1-af3e-46d6-9738-8e6833deaf02"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.840548 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-scripts" (OuterVolumeSpecName: "scripts") pod "33c297e1-af3e-46d6-9738-8e6833deaf02" (UID: "33c297e1-af3e-46d6-9738-8e6833deaf02"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.840832 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33c297e1-af3e-46d6-9738-8e6833deaf02-kube-api-access-jnnv4" (OuterVolumeSpecName: "kube-api-access-jnnv4") pod "33c297e1-af3e-46d6-9738-8e6833deaf02" (UID: "33c297e1-af3e-46d6-9738-8e6833deaf02"). InnerVolumeSpecName "kube-api-access-jnnv4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.903888 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33c297e1-af3e-46d6-9738-8e6833deaf02" (UID: "33c297e1-af3e-46d6-9738-8e6833deaf02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.935897 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnnv4\" (UniqueName: \"kubernetes.io/projected/33c297e1-af3e-46d6-9738-8e6833deaf02-kube-api-access-jnnv4\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.936040 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33c297e1-af3e-46d6-9738-8e6833deaf02-logs\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.936063 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.936072 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.941873 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-config-data" (OuterVolumeSpecName: "config-data") pod "33c297e1-af3e-46d6-9738-8e6833deaf02" (UID: "33c297e1-af3e-46d6-9738-8e6833deaf02"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.024304 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "33c297e1-af3e-46d6-9738-8e6833deaf02" (UID: "33c297e1-af3e-46d6-9738-8e6833deaf02"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.026748 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "33c297e1-af3e-46d6-9738-8e6833deaf02" (UID: "33c297e1-af3e-46d6-9738-8e6833deaf02"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.039201 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.039236 4781 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.039280 4781 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.384882 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9cntr" 
event={"ID":"d71a5c1e-7953-4acf-813a-0d96d4992d1f","Type":"ContainerStarted","Data":"b913bb52004e54e9c0de1dc5c1250761b05eab25edf15ed18eac691db4593cf7"} Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.387218 4781 generic.go:334] "Generic (PLEG): container finished" podID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerID="6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68" exitCode=0 Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.387248 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c479bbf8-lkpd7" event={"ID":"33c297e1-af3e-46d6-9738-8e6833deaf02","Type":"ContainerDied","Data":"6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68"} Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.387263 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c479bbf8-lkpd7" event={"ID":"33c297e1-af3e-46d6-9738-8e6833deaf02","Type":"ContainerDied","Data":"21b15cb407945a01adc26829ab99f15cd9c656e66d81cf610b3118b8b9526261"} Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.387278 4781 scope.go:117] "RemoveContainer" containerID="6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.387400 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.413172 4781 scope.go:117] "RemoveContainer" containerID="412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.431688 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-76c479bbf8-lkpd7"] Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.449628 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-76c479bbf8-lkpd7"] Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.456654 4781 scope.go:117] "RemoveContainer" containerID="6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68" Feb 27 00:28:42 crc kubenswrapper[4781]: E0227 00:28:42.460788 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68\": container with ID starting with 6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68 not found: ID does not exist" containerID="6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.460845 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68"} err="failed to get container status \"6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68\": rpc error: code = NotFound desc = could not find container \"6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68\": container with ID starting with 6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68 not found: ID does not exist" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.460869 4781 scope.go:117] "RemoveContainer" containerID="412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8" Feb 27 
00:28:42 crc kubenswrapper[4781]: E0227 00:28:42.461364 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8\": container with ID starting with 412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8 not found: ID does not exist" containerID="412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.461412 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8"} err="failed to get container status \"412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8\": rpc error: code = NotFound desc = could not find container \"412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8\": container with ID starting with 412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8 not found: ID does not exist" Feb 27 00:28:43 crc kubenswrapper[4781]: I0227 00:28:43.023164 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Feb 27 00:28:43 crc kubenswrapper[4781]: I0227 00:28:43.323950 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" path="/var/lib/kubelet/pods/33c297e1-af3e-46d6-9738-8e6833deaf02/volumes" Feb 27 00:28:43 crc kubenswrapper[4781]: I0227 00:28:43.412235 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerStarted","Data":"cfcdb38663d80d12b7e86a05dfe2ce7cc23ff17e6af4e336ba2f0e4a180806c3"} Feb 27 00:28:43 crc kubenswrapper[4781]: I0227 00:28:43.412416 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" 
containerName="ceilometer-central-agent" containerID="cri-o://7a5345b65b014bc9d0e2cd844013d91d1a91d4e408c41f0c7f4f964de80130f6" gracePeriod=30 Feb 27 00:28:43 crc kubenswrapper[4781]: I0227 00:28:43.412820 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 00:28:43 crc kubenswrapper[4781]: I0227 00:28:43.413212 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="proxy-httpd" containerID="cri-o://cfcdb38663d80d12b7e86a05dfe2ce7cc23ff17e6af4e336ba2f0e4a180806c3" gracePeriod=30 Feb 27 00:28:43 crc kubenswrapper[4781]: I0227 00:28:43.413283 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="sg-core" containerID="cri-o://9155e1f68a6370d2a59d952aff96914080df4756a62f18bb9bbc3ec507e49ef4" gracePeriod=30 Feb 27 00:28:43 crc kubenswrapper[4781]: I0227 00:28:43.413341 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="ceilometer-notification-agent" containerID="cri-o://16ba8a242e20589655027929d1c82fa25c3d9fc988018051237357efea8a8ec9" gracePeriod=30 Feb 27 00:28:44 crc kubenswrapper[4781]: I0227 00:28:44.424789 4781 generic.go:334] "Generic (PLEG): container finished" podID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerID="9155e1f68a6370d2a59d952aff96914080df4756a62f18bb9bbc3ec507e49ef4" exitCode=2 Feb 27 00:28:44 crc kubenswrapper[4781]: I0227 00:28:44.425122 4781 generic.go:334] "Generic (PLEG): container finished" podID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerID="16ba8a242e20589655027929d1c82fa25c3d9fc988018051237357efea8a8ec9" exitCode=0 Feb 27 00:28:44 crc kubenswrapper[4781]: I0227 00:28:44.424881 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerDied","Data":"9155e1f68a6370d2a59d952aff96914080df4756a62f18bb9bbc3ec507e49ef4"} Feb 27 00:28:44 crc kubenswrapper[4781]: I0227 00:28:44.425169 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerDied","Data":"16ba8a242e20589655027929d1c82fa25c3d9fc988018051237357efea8a8ec9"} Feb 27 00:28:51 crc kubenswrapper[4781]: I0227 00:28:51.344008 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=9.822551053 podStartE2EDuration="14.343982805s" podCreationTimestamp="2026-02-27 00:28:37 +0000 UTC" firstStartedPulling="2026-02-27 00:28:38.551308606 +0000 UTC m=+1387.808848150" lastFinishedPulling="2026-02-27 00:28:43.072740348 +0000 UTC m=+1392.330279902" observedRunningTime="2026-02-27 00:28:43.441425818 +0000 UTC m=+1392.698965372" watchObservedRunningTime="2026-02-27 00:28:51.343982805 +0000 UTC m=+1400.601522369" Feb 27 00:28:51 crc kubenswrapper[4781]: I0227 00:28:51.502050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9cntr" event={"ID":"d71a5c1e-7953-4acf-813a-0d96d4992d1f","Type":"ContainerStarted","Data":"a4bad047d90bd3b11bea212cddee0782007013387656451beeca5b44aee50150"} Feb 27 00:28:51 crc kubenswrapper[4781]: I0227 00:28:51.519524 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-9cntr" podStartSLOduration=2.37584504 podStartE2EDuration="11.519507997s" podCreationTimestamp="2026-02-27 00:28:40 +0000 UTC" firstStartedPulling="2026-02-27 00:28:41.426818341 +0000 UTC m=+1390.684357895" lastFinishedPulling="2026-02-27 00:28:50.570481298 +0000 UTC m=+1399.828020852" observedRunningTime="2026-02-27 00:28:51.515736415 +0000 UTC m=+1400.773275969" watchObservedRunningTime="2026-02-27 
00:28:51.519507997 +0000 UTC m=+1400.777047551" Feb 27 00:28:53 crc kubenswrapper[4781]: I0227 00:28:53.525820 4781 generic.go:334] "Generic (PLEG): container finished" podID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerID="7a5345b65b014bc9d0e2cd844013d91d1a91d4e408c41f0c7f4f964de80130f6" exitCode=0 Feb 27 00:28:53 crc kubenswrapper[4781]: I0227 00:28:53.525874 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerDied","Data":"7a5345b65b014bc9d0e2cd844013d91d1a91d4e408c41f0c7f4f964de80130f6"} Feb 27 00:29:01 crc kubenswrapper[4781]: I0227 00:29:01.622373 4781 generic.go:334] "Generic (PLEG): container finished" podID="d71a5c1e-7953-4acf-813a-0d96d4992d1f" containerID="a4bad047d90bd3b11bea212cddee0782007013387656451beeca5b44aee50150" exitCode=0 Feb 27 00:29:01 crc kubenswrapper[4781]: I0227 00:29:01.622479 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9cntr" event={"ID":"d71a5c1e-7953-4acf-813a-0d96d4992d1f","Type":"ContainerDied","Data":"a4bad047d90bd3b11bea212cddee0782007013387656451beeca5b44aee50150"} Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.093459 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.195362 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksn6l\" (UniqueName: \"kubernetes.io/projected/d71a5c1e-7953-4acf-813a-0d96d4992d1f-kube-api-access-ksn6l\") pod \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.195404 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-combined-ca-bundle\") pod \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.195530 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-scripts\") pod \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.195594 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-config-data\") pod \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.201803 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d71a5c1e-7953-4acf-813a-0d96d4992d1f-kube-api-access-ksn6l" (OuterVolumeSpecName: "kube-api-access-ksn6l") pod "d71a5c1e-7953-4acf-813a-0d96d4992d1f" (UID: "d71a5c1e-7953-4acf-813a-0d96d4992d1f"). InnerVolumeSpecName "kube-api-access-ksn6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.201943 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-scripts" (OuterVolumeSpecName: "scripts") pod "d71a5c1e-7953-4acf-813a-0d96d4992d1f" (UID: "d71a5c1e-7953-4acf-813a-0d96d4992d1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.226678 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d71a5c1e-7953-4acf-813a-0d96d4992d1f" (UID: "d71a5c1e-7953-4acf-813a-0d96d4992d1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.227715 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-config-data" (OuterVolumeSpecName: "config-data") pod "d71a5c1e-7953-4acf-813a-0d96d4992d1f" (UID: "d71a5c1e-7953-4acf-813a-0d96d4992d1f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.297249 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksn6l\" (UniqueName: \"kubernetes.io/projected/d71a5c1e-7953-4acf-813a-0d96d4992d1f-kube-api-access-ksn6l\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.297277 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.297287 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.297295 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.643454 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9cntr" event={"ID":"d71a5c1e-7953-4acf-813a-0d96d4992d1f","Type":"ContainerDied","Data":"b913bb52004e54e9c0de1dc5c1250761b05eab25edf15ed18eac691db4593cf7"} Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.643792 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b913bb52004e54e9c0de1dc5c1250761b05eab25edf15ed18eac691db4593cf7" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.643519 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.745830 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 27 00:29:03 crc kubenswrapper[4781]: E0227 00:29:03.746312 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerName="placement-log" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.746329 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerName="placement-log" Feb 27 00:29:03 crc kubenswrapper[4781]: E0227 00:29:03.746350 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerName="placement-api" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.746357 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerName="placement-api" Feb 27 00:29:03 crc kubenswrapper[4781]: E0227 00:29:03.746376 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71a5c1e-7953-4acf-813a-0d96d4992d1f" containerName="nova-cell0-conductor-db-sync" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.746382 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71a5c1e-7953-4acf-813a-0d96d4992d1f" containerName="nova-cell0-conductor-db-sync" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.746569 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerName="placement-log" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.746589 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerName="placement-api" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.746612 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d71a5c1e-7953-4acf-813a-0d96d4992d1f" containerName="nova-cell0-conductor-db-sync" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.747355 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.750019 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.754972 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-lsptr" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.756480 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.909824 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fvr6\" (UniqueName: \"kubernetes.io/projected/7503d0a7-eca6-4d15-9538-9cded970acc2-kube-api-access-5fvr6\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.910109 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7503d0a7-eca6-4d15-9538-9cded970acc2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.910259 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7503d0a7-eca6-4d15-9538-9cded970acc2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:04 crc 
kubenswrapper[4781]: I0227 00:29:04.012070 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7503d0a7-eca6-4d15-9538-9cded970acc2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:04 crc kubenswrapper[4781]: I0227 00:29:04.012234 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fvr6\" (UniqueName: \"kubernetes.io/projected/7503d0a7-eca6-4d15-9538-9cded970acc2-kube-api-access-5fvr6\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:04 crc kubenswrapper[4781]: I0227 00:29:04.012328 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7503d0a7-eca6-4d15-9538-9cded970acc2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:04 crc kubenswrapper[4781]: I0227 00:29:04.018618 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7503d0a7-eca6-4d15-9538-9cded970acc2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:04 crc kubenswrapper[4781]: I0227 00:29:04.018711 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7503d0a7-eca6-4d15-9538-9cded970acc2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:04 crc kubenswrapper[4781]: I0227 00:29:04.047253 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5fvr6\" (UniqueName: \"kubernetes.io/projected/7503d0a7-eca6-4d15-9538-9cded970acc2-kube-api-access-5fvr6\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:04 crc kubenswrapper[4781]: I0227 00:29:04.064123 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:04 crc kubenswrapper[4781]: I0227 00:29:04.530868 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 27 00:29:04 crc kubenswrapper[4781]: I0227 00:29:04.657825 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7503d0a7-eca6-4d15-9538-9cded970acc2","Type":"ContainerStarted","Data":"51ce2d4968b5aa25bc43cb9a6c14264106f6edeb7b625171dbf676e0c936523b"} Feb 27 00:29:05 crc kubenswrapper[4781]: I0227 00:29:05.669167 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7503d0a7-eca6-4d15-9538-9cded970acc2","Type":"ContainerStarted","Data":"829d4073c292cda6b13ce3bcf1e5167716db1791de3771bbdb28e0917b02ba8b"} Feb 27 00:29:05 crc kubenswrapper[4781]: I0227 00:29:05.669712 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:05 crc kubenswrapper[4781]: I0227 00:29:05.699915 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.699882075 podStartE2EDuration="2.699882075s" podCreationTimestamp="2026-02-27 00:29:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:05.683791421 +0000 UTC m=+1414.941330995" watchObservedRunningTime="2026-02-27 00:29:05.699882075 +0000 UTC m=+1414.957421669" Feb 27 00:29:08 crc kubenswrapper[4781]: I0227 00:29:08.009229 
4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.099926 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.622066 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-qjkwv"]
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.623658 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qjkwv"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.626275 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.626686 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.632177 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qjkwv"]
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.733272 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-config-data\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.733663 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-scripts\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.733725 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.733758 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7flr\" (UniqueName: \"kubernetes.io/projected/cd521dc6-4126-4c51-8634-66db8ba1412e-kube-api-access-f7flr\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.835734 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-config-data\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.835775 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-scripts\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.835826 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.835848 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7flr\" (UniqueName: \"kubernetes.io/projected/cd521dc6-4126-4c51-8634-66db8ba1412e-kube-api-access-f7flr\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.847356 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-config-data\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.848245 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.866984 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-scripts\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.869923 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.871770 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.876577 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.884714 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.887467 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.898097 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.924720 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7flr\" (UniqueName: \"kubernetes.io/projected/cd521dc6-4126-4c51-8634-66db8ba1412e-kube-api-access-f7flr\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.929703 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.946219 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qjkwv"
Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.948094 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.025762 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.028817 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.031348 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.062833 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7gxp\" (UniqueName: \"kubernetes.io/projected/902efa6b-d07e-4589-b6e6-8016dfdbcd57-kube-api-access-x7gxp\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.063194 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-config-data\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.063222 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/809e4ffe-9885-43b8-bb34-b748437f1bb9-logs\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.063247 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-config-data\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.063293 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrbkq\" (UniqueName: \"kubernetes.io/projected/809e4ffe-9885-43b8-bb34-b748437f1bb9-kube-api-access-qrbkq\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.063320 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.063390 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.063454 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/902efa6b-d07e-4589-b6e6-8016dfdbcd57-logs\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.166200 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.168601 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.168676 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-config-data\") pod \"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.168779 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/902efa6b-d07e-4589-b6e6-8016dfdbcd57-logs\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.168988 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7gxp\" (UniqueName: \"kubernetes.io/projected/902efa6b-d07e-4589-b6e6-8016dfdbcd57-kube-api-access-x7gxp\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.169017 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.169070 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-config-data\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.169091 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ccw8\" (UniqueName: \"kubernetes.io/projected/de057148-8197-4717-bbcc-636e6d64344a-kube-api-access-9ccw8\") pod \"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.169130 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/809e4ffe-9885-43b8-bb34-b748437f1bb9-logs\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.169151 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-config-data\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.169212 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrbkq\" (UniqueName: \"kubernetes.io/projected/809e4ffe-9885-43b8-bb34-b748437f1bb9-kube-api-access-qrbkq\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.169239 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.169992 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/902efa6b-d07e-4589-b6e6-8016dfdbcd57-logs\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.170415 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/809e4ffe-9885-43b8-bb34-b748437f1bb9-logs\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.176073 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.183938 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-config-data\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.189350 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-config-data\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.203560 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrbkq\" (UniqueName: \"kubernetes.io/projected/809e4ffe-9885-43b8-bb34-b748437f1bb9-kube-api-access-qrbkq\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.208061 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.210347 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.210646 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7gxp\" (UniqueName: \"kubernetes.io/projected/902efa6b-d07e-4589-b6e6-8016dfdbcd57-kube-api-access-x7gxp\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.211843 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.216222 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.251000 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.270848 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.270903 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ccw8\" (UniqueName: \"kubernetes.io/projected/de057148-8197-4717-bbcc-636e6d64344a-kube-api-access-9ccw8\") pod \"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.271005 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-config-data\") pod \"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.276304 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-config-data\") pod \"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.278096 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cd565959-l4cw7"]
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.279923 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.280192 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.299832 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ccw8\" (UniqueName: \"kubernetes.io/projected/de057148-8197-4717-bbcc-636e6d64344a-kube-api-access-9ccw8\") pod \"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.320151 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-l4cw7"]
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.373729 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-config\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.373871 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9vk6\" (UniqueName: \"kubernetes.io/projected/5f47f2d5-f4d5-448d-9355-ebe37959b584-kube-api-access-f9vk6\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.373904 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.373942 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxdkk\" (UniqueName: \"kubernetes.io/projected/8d780ba2-9829-430e-9a56-0b5b052bfbb7-kube-api-access-bxdkk\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.373967 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.374152 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-svc\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.374177 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.374213 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.374239 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.462427 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476170 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-svc\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476211 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476234 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476250 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476284 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-config\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476344 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9vk6\" (UniqueName: \"kubernetes.io/projected/5f47f2d5-f4d5-448d-9355-ebe37959b584-kube-api-access-f9vk6\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476364 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476391 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxdkk\" (UniqueName: \"kubernetes.io/projected/8d780ba2-9829-430e-9a56-0b5b052bfbb7-kube-api-access-bxdkk\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476407 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.478004 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.479239 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-svc\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.479757 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.481688 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-config\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.486974 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.490311 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.492280 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.504025 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.511693 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9vk6\" (UniqueName: \"kubernetes.io/projected/5f47f2d5-f4d5-448d-9355-ebe37959b584-kube-api-access-f9vk6\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.519154 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.525248 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxdkk\" (UniqueName: \"kubernetes.io/projected/8d780ba2-9829-430e-9a56-0b5b052bfbb7-kube-api-access-bxdkk\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.555948 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.609456 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.652833 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qjkwv"]
Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.832419 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qjkwv" event={"ID":"cd521dc6-4126-4c51-8634-66db8ba1412e","Type":"ContainerStarted","Data":"6b1e78ae032b9557d03ea57a421dc5b2962405bd66d1c8415a0c89f4e9888284"}
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.367374 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tg9k8"]
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.369173 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tg9k8"
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.373769 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.374039 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.380308 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tg9k8"]
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.500796 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-scripts\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8"
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.500868 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8"
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.500950 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-config-data\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8"
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.501036 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7dh2\" (UniqueName: \"kubernetes.io/projected/b607db2c-2aa3-48f0-9cd8-c5461797431c-kube-api-access-f7dh2\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8"
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.529554 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.565363 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.603918 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-config-data\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8"
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.604026 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7dh2\" (UniqueName: \"kubernetes.io/projected/b607db2c-2aa3-48f0-9cd8-c5461797431c-kube-api-access-f7dh2\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8"
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.604150 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-scripts\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8"
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.604187 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8"
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.619979 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8"
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.637287 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-scripts\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8"
Feb 27 00:29:11 crc kubenswrapper[4781]: W0227 00:29:11.637394 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d780ba2_9829_430e_9a56_0b5b052bfbb7.slice/crio-c962c4b95a57d2e6b554d146c58ed360df05b5f45a673155452c828cacdad50b WatchSource:0}: Error finding container c962c4b95a57d2e6b554d146c58ed360df05b5f45a673155452c828cacdad50b: Status 404 returned error can't find the container with id c962c4b95a57d2e6b554d146c58ed360df05b5f45a673155452c828cacdad50b
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.638269 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-config-data\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8"
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.638839 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7dh2\" (UniqueName: \"kubernetes.io/projected/b607db2c-2aa3-48f0-9cd8-c5461797431c-kube-api-access-f7dh2\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8"
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.639721 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.656188 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.709197 4781 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tg9k8" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.874327 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8d780ba2-9829-430e-9a56-0b5b052bfbb7","Type":"ContainerStarted","Data":"c962c4b95a57d2e6b554d146c58ed360df05b5f45a673155452c828cacdad50b"} Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.879260 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qjkwv" event={"ID":"cd521dc6-4126-4c51-8634-66db8ba1412e","Type":"ContainerStarted","Data":"c9388f02af5b31dc8f5e8ea62ee66fb19cbab695e94e5d03ed46c036e292ce69"} Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.893812 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"809e4ffe-9885-43b8-bb34-b748437f1bb9","Type":"ContainerStarted","Data":"3a3cfa9569cf1e101c985b875f586bf5df5c1e9c190016bf01cb0461f1a4b9c8"} Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.902578 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"902efa6b-d07e-4589-b6e6-8016dfdbcd57","Type":"ContainerStarted","Data":"10dbcf9aa331b09eb162dae4f7eb67ae5890ce7956c09aaa8725da5e211a8996"} Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.904153 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"de057148-8197-4717-bbcc-636e6d64344a","Type":"ContainerStarted","Data":"5ecdf1c41abef4437c80f6d85c04db80a9d6858579c757ef6823795e81d59b23"} Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.912533 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-qjkwv" podStartSLOduration=2.912511527 podStartE2EDuration="2.912511527s" podCreationTimestamp="2026-02-27 00:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-27 00:29:11.902702703 +0000 UTC m=+1421.160242257" watchObservedRunningTime="2026-02-27 00:29:11.912511527 +0000 UTC m=+1421.170051091" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.988900 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-l4cw7"] Feb 27 00:29:12 crc kubenswrapper[4781]: W0227 00:29:12.288054 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb607db2c_2aa3_48f0_9cd8_c5461797431c.slice/crio-34decb527184bd4ff609a070de28e4da18ec18094d195f97919d2454806f58d8 WatchSource:0}: Error finding container 34decb527184bd4ff609a070de28e4da18ec18094d195f97919d2454806f58d8: Status 404 returned error can't find the container with id 34decb527184bd4ff609a070de28e4da18ec18094d195f97919d2454806f58d8 Feb 27 00:29:12 crc kubenswrapper[4781]: I0227 00:29:12.290013 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tg9k8"] Feb 27 00:29:12 crc kubenswrapper[4781]: I0227 00:29:12.927336 4781 generic.go:334] "Generic (PLEG): container finished" podID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerID="363437972dc1edd0a85fa61204497c017a7b8e034221df5e68a301f8138ef7f7" exitCode=0 Feb 27 00:29:12 crc kubenswrapper[4781]: I0227 00:29:12.927685 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" event={"ID":"5f47f2d5-f4d5-448d-9355-ebe37959b584","Type":"ContainerDied","Data":"363437972dc1edd0a85fa61204497c017a7b8e034221df5e68a301f8138ef7f7"} Feb 27 00:29:12 crc kubenswrapper[4781]: I0227 00:29:12.927747 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" event={"ID":"5f47f2d5-f4d5-448d-9355-ebe37959b584","Type":"ContainerStarted","Data":"e0e61b6d097a768cedf938a2051e02fe6b26d59774f1dfea50ad4f92d0779d0a"} Feb 27 00:29:12 crc kubenswrapper[4781]: I0227 00:29:12.938276 
4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tg9k8" event={"ID":"b607db2c-2aa3-48f0-9cd8-c5461797431c","Type":"ContainerStarted","Data":"39276ac01bb5ee770105ba2bf75f8d61d8081e22c89cdaa97c9f7ed7f2722110"} Feb 27 00:29:12 crc kubenswrapper[4781]: I0227 00:29:12.938321 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tg9k8" event={"ID":"b607db2c-2aa3-48f0-9cd8-c5461797431c","Type":"ContainerStarted","Data":"34decb527184bd4ff609a070de28e4da18ec18094d195f97919d2454806f58d8"} Feb 27 00:29:12 crc kubenswrapper[4781]: I0227 00:29:12.996049 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-tg9k8" podStartSLOduration=1.9960263889999998 podStartE2EDuration="1.996026389s" podCreationTimestamp="2026-02-27 00:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:12.97495893 +0000 UTC m=+1422.232498484" watchObservedRunningTime="2026-02-27 00:29:12.996026389 +0000 UTC m=+1422.253565943" Feb 27 00:29:13 crc kubenswrapper[4781]: I0227 00:29:13.634570 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:13 crc kubenswrapper[4781]: I0227 00:29:13.656331 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 00:29:13 crc kubenswrapper[4781]: E0227 00:29:13.714533 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c0d1328_b565_4c9e_a9dc_e7b863568260.slice/crio-conmon-cfcdb38663d80d12b7e86a05dfe2ce7cc23ff17e6af4e336ba2f0e4a180806c3.scope\": RecentStats: unable to find data in memory cache]" Feb 27 00:29:13 crc kubenswrapper[4781]: I0227 00:29:13.963054 4781 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" event={"ID":"5f47f2d5-f4d5-448d-9355-ebe37959b584","Type":"ContainerStarted","Data":"26d79208d95dcfd480e6dcf5e635ea74d70976218b9d0db2771a4aca513d9249"} Feb 27 00:29:13 crc kubenswrapper[4781]: I0227 00:29:13.963690 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:13 crc kubenswrapper[4781]: I0227 00:29:13.975033 4781 generic.go:334] "Generic (PLEG): container finished" podID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerID="cfcdb38663d80d12b7e86a05dfe2ce7cc23ff17e6af4e336ba2f0e4a180806c3" exitCode=137 Feb 27 00:29:13 crc kubenswrapper[4781]: I0227 00:29:13.975266 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerDied","Data":"cfcdb38663d80d12b7e86a05dfe2ce7cc23ff17e6af4e336ba2f0e4a180806c3"} Feb 27 00:29:13 crc kubenswrapper[4781]: I0227 00:29:13.991370 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" podStartSLOduration=3.9913453089999997 podStartE2EDuration="3.991345309s" podCreationTimestamp="2026-02-27 00:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:13.980686371 +0000 UTC m=+1423.238225925" watchObservedRunningTime="2026-02-27 00:29:13.991345309 +0000 UTC m=+1423.248884863" Feb 27 00:29:14 crc kubenswrapper[4781]: I0227 00:29:14.990539 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerDied","Data":"72f31190f456d76c84b55f85362321bc4ca382df7c3a1c86e9e23616be0d7246"} Feb 27 00:29:14 crc kubenswrapper[4781]: I0227 00:29:14.990600 4781 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="72f31190f456d76c84b55f85362321bc4ca382df7c3a1c86e9e23616be0d7246" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.087721 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.220823 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-config-data\") pod \"6c0d1328-b565-4c9e-a9dc-e7b863568260\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.220995 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-log-httpd\") pod \"6c0d1328-b565-4c9e-a9dc-e7b863568260\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.221100 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-run-httpd\") pod \"6c0d1328-b565-4c9e-a9dc-e7b863568260\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.221127 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-combined-ca-bundle\") pod \"6c0d1328-b565-4c9e-a9dc-e7b863568260\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.221163 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-scripts\") pod \"6c0d1328-b565-4c9e-a9dc-e7b863568260\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " Feb 27 00:29:15 
crc kubenswrapper[4781]: I0227 00:29:15.221200 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hs5n\" (UniqueName: \"kubernetes.io/projected/6c0d1328-b565-4c9e-a9dc-e7b863568260-kube-api-access-2hs5n\") pod \"6c0d1328-b565-4c9e-a9dc-e7b863568260\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.221226 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-sg-core-conf-yaml\") pod \"6c0d1328-b565-4c9e-a9dc-e7b863568260\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.221585 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6c0d1328-b565-4c9e-a9dc-e7b863568260" (UID: "6c0d1328-b565-4c9e-a9dc-e7b863568260"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.222190 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.222214 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6c0d1328-b565-4c9e-a9dc-e7b863568260" (UID: "6c0d1328-b565-4c9e-a9dc-e7b863568260"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.227937 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-scripts" (OuterVolumeSpecName: "scripts") pod "6c0d1328-b565-4c9e-a9dc-e7b863568260" (UID: "6c0d1328-b565-4c9e-a9dc-e7b863568260"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.232704 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c0d1328-b565-4c9e-a9dc-e7b863568260-kube-api-access-2hs5n" (OuterVolumeSpecName: "kube-api-access-2hs5n") pod "6c0d1328-b565-4c9e-a9dc-e7b863568260" (UID: "6c0d1328-b565-4c9e-a9dc-e7b863568260"). InnerVolumeSpecName "kube-api-access-2hs5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.274267 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6c0d1328-b565-4c9e-a9dc-e7b863568260" (UID: "6c0d1328-b565-4c9e-a9dc-e7b863568260"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.324422 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.324457 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.324467 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hs5n\" (UniqueName: \"kubernetes.io/projected/6c0d1328-b565-4c9e-a9dc-e7b863568260-kube-api-access-2hs5n\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.324477 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.325445 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c0d1328-b565-4c9e-a9dc-e7b863568260" (UID: "6c0d1328-b565-4c9e-a9dc-e7b863568260"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.365383 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-config-data" (OuterVolumeSpecName: "config-data") pod "6c0d1328-b565-4c9e-a9dc-e7b863568260" (UID: "6c0d1328-b565-4c9e-a9dc-e7b863568260"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.426882 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.427236 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.003073 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.042653 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.054108 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.079942 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:16 crc kubenswrapper[4781]: E0227 00:29:16.080401 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="ceilometer-notification-agent" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.080427 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="ceilometer-notification-agent" Feb 27 00:29:16 crc kubenswrapper[4781]: E0227 00:29:16.080446 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="proxy-httpd" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.080457 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" 
containerName="proxy-httpd" Feb 27 00:29:16 crc kubenswrapper[4781]: E0227 00:29:16.080500 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="ceilometer-central-agent" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.080511 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="ceilometer-central-agent" Feb 27 00:29:16 crc kubenswrapper[4781]: E0227 00:29:16.080520 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="sg-core" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.080528 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="sg-core" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.080822 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="ceilometer-central-agent" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.080849 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="sg-core" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.080867 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="ceilometer-notification-agent" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.080891 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="proxy-httpd" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.083204 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.088015 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.088298 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.089596 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.245811 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.245854 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-run-httpd\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.245880 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-scripts\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.246082 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tthpk\" (UniqueName: \"kubernetes.io/projected/7825d67f-c124-4ee9-9e74-32c35c4370c0-kube-api-access-tthpk\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " 
pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.246205 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.246469 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-log-httpd\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.246529 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-config-data\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.348565 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-log-httpd\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.348641 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-config-data\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.348709 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.348738 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-run-httpd\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.348777 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-scripts\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.348881 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tthpk\" (UniqueName: \"kubernetes.io/projected/7825d67f-c124-4ee9-9e74-32c35c4370c0-kube-api-access-tthpk\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.348971 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.349194 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-log-httpd\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc 
kubenswrapper[4781]: I0227 00:29:16.349596 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-run-httpd\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0"
Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.354056 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0"
Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.355233 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0"
Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.355649 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-scripts\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0"
Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.359126 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-config-data\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0"
Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.376772 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tthpk\" (UniqueName: \"kubernetes.io/projected/7825d67f-c124-4ee9-9e74-32c35c4370c0-kube-api-access-tthpk\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0"
Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.436752 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.024118 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.025863 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"902efa6b-d07e-4589-b6e6-8016dfdbcd57","Type":"ContainerStarted","Data":"df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03"}
Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.027920 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"de057148-8197-4717-bbcc-636e6d64344a","Type":"ContainerStarted","Data":"d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de"}
Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.040732 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8d780ba2-9829-430e-9a56-0b5b052bfbb7","Type":"ContainerStarted","Data":"7cb922ac2fcfd76994a7254d975044d1fe0a7563db3547acc86bfb78f94c47a2"}
Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.040804 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8d780ba2-9829-430e-9a56-0b5b052bfbb7" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://7cb922ac2fcfd76994a7254d975044d1fe0a7563db3547acc86bfb78f94c47a2" gracePeriod=30
Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.044973 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"809e4ffe-9885-43b8-bb34-b748437f1bb9","Type":"ContainerStarted","Data":"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399"}
Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.050884 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.353703748 podStartE2EDuration="8.050865616s" podCreationTimestamp="2026-02-27 00:29:09 +0000 UTC" firstStartedPulling="2026-02-27 00:29:11.649971965 +0000 UTC m=+1420.907511519" lastFinishedPulling="2026-02-27 00:29:16.347133823 +0000 UTC m=+1425.604673387" observedRunningTime="2026-02-27 00:29:17.048930093 +0000 UTC m=+1426.306469647" watchObservedRunningTime="2026-02-27 00:29:17.050865616 +0000 UTC m=+1426.308405170"
Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.077935 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.377793597 podStartE2EDuration="8.077913306s" podCreationTimestamp="2026-02-27 00:29:09 +0000 UTC" firstStartedPulling="2026-02-27 00:29:11.649585924 +0000 UTC m=+1420.907125478" lastFinishedPulling="2026-02-27 00:29:16.349705623 +0000 UTC m=+1425.607245187" observedRunningTime="2026-02-27 00:29:17.070891427 +0000 UTC m=+1426.328430981" watchObservedRunningTime="2026-02-27 00:29:17.077913306 +0000 UTC m=+1426.335452860"
Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.337170 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" path="/var/lib/kubelet/pods/6c0d1328-b565-4c9e-a9dc-e7b863568260/volumes"
Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.057078 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"902efa6b-d07e-4589-b6e6-8016dfdbcd57","Type":"ContainerStarted","Data":"ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e"}
Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.059389 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"809e4ffe-9885-43b8-bb34-b748437f1bb9","Type":"ContainerStarted","Data":"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6"}
Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.059395 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerName="nova-metadata-log" containerID="cri-o://4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399" gracePeriod=30
Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.059443 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerName="nova-metadata-metadata" containerID="cri-o://283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6" gracePeriod=30
Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.061319 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerStarted","Data":"a90b7ced7061699d62e894c9b3b31c21fe93acf06b438953563f0da53923c22d"}
Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.084527 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.07967191 podStartE2EDuration="9.084511021s" podCreationTimestamp="2026-02-27 00:29:09 +0000 UTC" firstStartedPulling="2026-02-27 00:29:11.567221409 +0000 UTC m=+1420.824760963" lastFinishedPulling="2026-02-27 00:29:16.57206052 +0000 UTC m=+1425.829600074" observedRunningTime="2026-02-27 00:29:18.083526474 +0000 UTC m=+1427.341066028" watchObservedRunningTime="2026-02-27 00:29:18.084511021 +0000 UTC m=+1427.342050575"
Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.105515 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.092074905 podStartE2EDuration="9.105493248s" podCreationTimestamp="2026-02-27 00:29:09 +0000 UTC" firstStartedPulling="2026-02-27 00:29:11.523817446 +0000 UTC m=+1420.781357000" lastFinishedPulling="2026-02-27 00:29:16.537235789 +0000 UTC m=+1425.794775343" observedRunningTime="2026-02-27 00:29:18.099321511 +0000 UTC m=+1427.356861065" watchObservedRunningTime="2026-02-27 00:29:18.105493248 +0000 UTC m=+1427.363032802"
Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.811364 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.909784 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrbkq\" (UniqueName: \"kubernetes.io/projected/809e4ffe-9885-43b8-bb34-b748437f1bb9-kube-api-access-qrbkq\") pod \"809e4ffe-9885-43b8-bb34-b748437f1bb9\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") "
Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.909843 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-combined-ca-bundle\") pod \"809e4ffe-9885-43b8-bb34-b748437f1bb9\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") "
Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.910017 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-config-data\") pod \"809e4ffe-9885-43b8-bb34-b748437f1bb9\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") "
Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.910098 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/809e4ffe-9885-43b8-bb34-b748437f1bb9-logs\") pod \"809e4ffe-9885-43b8-bb34-b748437f1bb9\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") "
Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.910886 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/809e4ffe-9885-43b8-bb34-b748437f1bb9-logs" (OuterVolumeSpecName: "logs") pod "809e4ffe-9885-43b8-bb34-b748437f1bb9" (UID: "809e4ffe-9885-43b8-bb34-b748437f1bb9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.919920 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/809e4ffe-9885-43b8-bb34-b748437f1bb9-kube-api-access-qrbkq" (OuterVolumeSpecName: "kube-api-access-qrbkq") pod "809e4ffe-9885-43b8-bb34-b748437f1bb9" (UID: "809e4ffe-9885-43b8-bb34-b748437f1bb9"). InnerVolumeSpecName "kube-api-access-qrbkq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.943045 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "809e4ffe-9885-43b8-bb34-b748437f1bb9" (UID: "809e4ffe-9885-43b8-bb34-b748437f1bb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.969726 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-config-data" (OuterVolumeSpecName: "config-data") pod "809e4ffe-9885-43b8-bb34-b748437f1bb9" (UID: "809e4ffe-9885-43b8-bb34-b748437f1bb9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.012941 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrbkq\" (UniqueName: \"kubernetes.io/projected/809e4ffe-9885-43b8-bb34-b748437f1bb9-kube-api-access-qrbkq\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.012975 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.012984 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.012994 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/809e4ffe-9885-43b8-bb34-b748437f1bb9-logs\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.098074 4781 generic.go:334] "Generic (PLEG): container finished" podID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerID="283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6" exitCode=0
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.098487 4781 generic.go:334] "Generic (PLEG): container finished" podID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerID="4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399" exitCode=143
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.098318 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"809e4ffe-9885-43b8-bb34-b748437f1bb9","Type":"ContainerDied","Data":"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6"}
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.098583 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"809e4ffe-9885-43b8-bb34-b748437f1bb9","Type":"ContainerDied","Data":"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399"}
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.098600 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"809e4ffe-9885-43b8-bb34-b748437f1bb9","Type":"ContainerDied","Data":"3a3cfa9569cf1e101c985b875f586bf5df5c1e9c190016bf01cb0461f1a4b9c8"}
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.098620 4781 scope.go:117] "RemoveContainer" containerID="283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.098405 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.112148 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerStarted","Data":"3ddb72adfd8dadbe432eb551c304f261946dae5663273b00c2b5c6ab9ec5b0b1"}
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.154811 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.159846 4781 scope.go:117] "RemoveContainer" containerID="4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.183888 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.185207 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 27 00:29:19 crc kubenswrapper[4781]: E0227 00:29:19.185859 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerName="nova-metadata-metadata"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.185959 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerName="nova-metadata-metadata"
Feb 27 00:29:19 crc kubenswrapper[4781]: E0227 00:29:19.186048 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerName="nova-metadata-log"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.186107 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerName="nova-metadata-log"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.186355 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerName="nova-metadata-log"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.186440 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerName="nova-metadata-metadata"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.187587 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.192910 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.193167 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.208294 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.210913 4781 scope.go:117] "RemoveContainer" containerID="283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6"
Feb 27 00:29:19 crc kubenswrapper[4781]: E0227 00:29:19.211338 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6\": container with ID starting with 283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6 not found: ID does not exist" containerID="283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.211479 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6"} err="failed to get container status \"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6\": rpc error: code = NotFound desc = could not find container \"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6\": container with ID starting with 283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6 not found: ID does not exist"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.211584 4781 scope.go:117] "RemoveContainer" containerID="4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399"
Feb 27 00:29:19 crc kubenswrapper[4781]: E0227 00:29:19.211897 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399\": container with ID starting with 4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399 not found: ID does not exist" containerID="4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.211988 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399"} err="failed to get container status \"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399\": rpc error: code = NotFound desc = could not find container \"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399\": container with ID starting with 4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399 not found: ID does not exist"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.212058 4781 scope.go:117] "RemoveContainer" containerID="283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.212288 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6"} err="failed to get container status \"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6\": rpc error: code = NotFound desc = could not find container \"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6\": container with ID starting with 283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6 not found: ID does not exist"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.212378 4781 scope.go:117] "RemoveContainer" containerID="4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.212647 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399"} err="failed to get container status \"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399\": rpc error: code = NotFound desc = could not find container \"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399\": container with ID starting with 4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399 not found: ID does not exist"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.319395 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.319459 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed68203-3ac6-4133-92d9-175f234d5229-logs\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.319529 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-config-data\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.319580 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c25tq\" (UniqueName: \"kubernetes.io/projected/5ed68203-3ac6-4133-92d9-175f234d5229-kube-api-access-c25tq\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.319601 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.327820 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" path="/var/lib/kubelet/pods/809e4ffe-9885-43b8-bb34-b748437f1bb9/volumes"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.421482 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c25tq\" (UniqueName: \"kubernetes.io/projected/5ed68203-3ac6-4133-92d9-175f234d5229-kube-api-access-c25tq\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.421831 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.421953 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.422005 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed68203-3ac6-4133-92d9-175f234d5229-logs\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.422082 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-config-data\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.422911 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed68203-3ac6-4133-92d9-175f234d5229-logs\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.425901 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-config-data\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.426486 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.434711 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.437722 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c25tq\" (UniqueName: \"kubernetes.io/projected/5ed68203-3ac6-4133-92d9-175f234d5229-kube-api-access-c25tq\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0"
Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.526153 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.040942 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 27 00:29:20 crc kubenswrapper[4781]: W0227 00:29:20.043061 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ed68203_3ac6_4133_92d9_175f234d5229.slice/crio-1402200c1e03e61992b31653885e83e4b38fc47073559c639e90d417e5d65cb7 WatchSource:0}: Error finding container 1402200c1e03e61992b31653885e83e4b38fc47073559c639e90d417e5d65cb7: Status 404 returned error can't find the container with id 1402200c1e03e61992b31653885e83e4b38fc47073559c639e90d417e5d65cb7
Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.133302 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ed68203-3ac6-4133-92d9-175f234d5229","Type":"ContainerStarted","Data":"1402200c1e03e61992b31653885e83e4b38fc47073559c639e90d417e5d65cb7"}
Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.135923 4781 generic.go:334] "Generic (PLEG): container finished" podID="cd521dc6-4126-4c51-8634-66db8ba1412e" containerID="c9388f02af5b31dc8f5e8ea62ee66fb19cbab695e94e5d03ed46c036e292ce69" exitCode=0
Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.135974 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qjkwv" event={"ID":"cd521dc6-4126-4c51-8634-66db8ba1412e","Type":"ContainerDied","Data":"c9388f02af5b31dc8f5e8ea62ee66fb19cbab695e94e5d03ed46c036e292ce69"}
Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.147275 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerStarted","Data":"fd9909df11f574e0138a430f34c72bf18ace2c57464e54425e45df0b7fd14f75"}
Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.147315 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerStarted","Data":"9a609615b1e77c141503575f4b85bd73b7b9605cdd075e949757163fb3230f19"}
Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.464081 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.464132 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.521542 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.521596 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.556724 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.561069 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.618343 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.715685 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-5mf9t"]
Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.715954 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" podUID="39b2afc0-76d7-48e9-8528-f88e3ba22955" containerName="dnsmasq-dns" containerID="cri-o://bb8c0d69bd70d80999cf07d7e8306d44a8648ef91de2762edd1a659e5f8fb1d6" gracePeriod=10
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.164901 4781 generic.go:334] "Generic (PLEG): container finished" podID="39b2afc0-76d7-48e9-8528-f88e3ba22955" containerID="bb8c0d69bd70d80999cf07d7e8306d44a8648ef91de2762edd1a659e5f8fb1d6" exitCode=0
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.164988 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" event={"ID":"39b2afc0-76d7-48e9-8528-f88e3ba22955","Type":"ContainerDied","Data":"bb8c0d69bd70d80999cf07d7e8306d44a8648ef91de2762edd1a659e5f8fb1d6"}
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.173690 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ed68203-3ac6-4133-92d9-175f234d5229","Type":"ContainerStarted","Data":"be2fe215086cd4058aea52c301ed09e04ac3143d7e54d38772b785701e47e5f8"}
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.174049 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ed68203-3ac6-4133-92d9-175f234d5229","Type":"ContainerStarted","Data":"bfa97c01ece2e8cbadd8eda7e12994d67d495e411ba60ed25dc9b412019a8f03"}
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.202136 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.202098776 podStartE2EDuration="2.202098776s" podCreationTimestamp="2026-02-27 00:29:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:21.197860642 +0000 UTC m=+1430.455400216" watchObservedRunningTime="2026-02-27 00:29:21.202098776 +0000 UTC m=+1430.459638330"
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.253153 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.431682 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t"
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.471907 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-nb\") pod \"39b2afc0-76d7-48e9-8528-f88e3ba22955\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") "
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.471981 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-svc\") pod \"39b2afc0-76d7-48e9-8528-f88e3ba22955\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") "
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.472131 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-sb\") pod \"39b2afc0-76d7-48e9-8528-f88e3ba22955\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") "
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.472186 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-config\") pod \"39b2afc0-76d7-48e9-8528-f88e3ba22955\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") "
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.472227 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-swift-storage-0\") pod \"39b2afc0-76d7-48e9-8528-f88e3ba22955\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") "
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.472350 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4nk7\" (UniqueName: \"kubernetes.io/projected/39b2afc0-76d7-48e9-8528-f88e3ba22955-kube-api-access-w4nk7\") pod \"39b2afc0-76d7-48e9-8528-f88e3ba22955\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") "
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.481936 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39b2afc0-76d7-48e9-8528-f88e3ba22955-kube-api-access-w4nk7" (OuterVolumeSpecName: "kube-api-access-w4nk7") pod "39b2afc0-76d7-48e9-8528-f88e3ba22955" (UID: "39b2afc0-76d7-48e9-8528-f88e3ba22955"). InnerVolumeSpecName "kube-api-access-w4nk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.549305 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.549433 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.575317 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4nk7\" (UniqueName: \"kubernetes.io/projected/39b2afc0-76d7-48e9-8528-f88e3ba22955-kube-api-access-w4nk7\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.579320 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "39b2afc0-76d7-48e9-8528-f88e3ba22955" (UID: "39b2afc0-76d7-48e9-8528-f88e3ba22955"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.591463 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "39b2afc0-76d7-48e9-8528-f88e3ba22955" (UID: "39b2afc0-76d7-48e9-8528-f88e3ba22955"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.606149 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-config" (OuterVolumeSpecName: "config") pod "39b2afc0-76d7-48e9-8528-f88e3ba22955" (UID: "39b2afc0-76d7-48e9-8528-f88e3ba22955"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.621500 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "39b2afc0-76d7-48e9-8528-f88e3ba22955" (UID: "39b2afc0-76d7-48e9-8528-f88e3ba22955"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.623398 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "39b2afc0-76d7-48e9-8528-f88e3ba22955" (UID: "39b2afc0-76d7-48e9-8528-f88e3ba22955"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.679895 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.679928 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.679938 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.679948 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.679977 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.810239 4781 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.883296 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7flr\" (UniqueName: \"kubernetes.io/projected/cd521dc6-4126-4c51-8634-66db8ba1412e-kube-api-access-f7flr\") pod \"cd521dc6-4126-4c51-8634-66db8ba1412e\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.883382 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-combined-ca-bundle\") pod \"cd521dc6-4126-4c51-8634-66db8ba1412e\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.883477 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-scripts\") pod \"cd521dc6-4126-4c51-8634-66db8ba1412e\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.883580 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-config-data\") pod \"cd521dc6-4126-4c51-8634-66db8ba1412e\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.887641 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd521dc6-4126-4c51-8634-66db8ba1412e-kube-api-access-f7flr" (OuterVolumeSpecName: "kube-api-access-f7flr") pod "cd521dc6-4126-4c51-8634-66db8ba1412e" (UID: "cd521dc6-4126-4c51-8634-66db8ba1412e"). InnerVolumeSpecName "kube-api-access-f7flr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.899267 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-scripts" (OuterVolumeSpecName: "scripts") pod "cd521dc6-4126-4c51-8634-66db8ba1412e" (UID: "cd521dc6-4126-4c51-8634-66db8ba1412e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.920328 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd521dc6-4126-4c51-8634-66db8ba1412e" (UID: "cd521dc6-4126-4c51-8634-66db8ba1412e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.942834 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-config-data" (OuterVolumeSpecName: "config-data") pod "cd521dc6-4126-4c51-8634-66db8ba1412e" (UID: "cd521dc6-4126-4c51-8634-66db8ba1412e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.986284 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.986315 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.986324 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.986333 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7flr\" (UniqueName: \"kubernetes.io/projected/cd521dc6-4126-4c51-8634-66db8ba1412e-kube-api-access-f7flr\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.204734 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qjkwv" event={"ID":"cd521dc6-4126-4c51-8634-66db8ba1412e","Type":"ContainerDied","Data":"6b1e78ae032b9557d03ea57a421dc5b2962405bd66d1c8415a0c89f4e9888284"} Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.204772 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b1e78ae032b9557d03ea57a421dc5b2962405bd66d1c8415a0c89f4e9888284" Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.204836 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.227140 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.228140 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" event={"ID":"39b2afc0-76d7-48e9-8528-f88e3ba22955","Type":"ContainerDied","Data":"ede845938dcbb2c0e3303591186eb47bf17d10a92d1b0dd61b8430ff2dd6aa13"} Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.228221 4781 scope.go:117] "RemoveContainer" containerID="bb8c0d69bd70d80999cf07d7e8306d44a8648ef91de2762edd1a659e5f8fb1d6" Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.293735 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.294021 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-log" containerID="cri-o://df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03" gracePeriod=30 Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.294274 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-api" containerID="cri-o://ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e" gracePeriod=30 Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.306501 4781 scope.go:117] "RemoveContainer" containerID="ba0fa606453c74eda00c418113d9f320bbbe55741c968eedcc82d3ff7571054d" Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.313699 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.343479 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-5mf9t"] Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.354911 4781 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-67bdc55879-5mf9t"] Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.365697 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 00:29:23.236968 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerStarted","Data":"78c52f488afaab989176f5c5ab096fb61a4e74a72dc5d52ce83048b14f67d902"} Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 00:29:23.237460 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 00:29:23.240356 4781 generic.go:334] "Generic (PLEG): container finished" podID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerID="df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03" exitCode=143 Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 00:29:23.240518 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="de057148-8197-4717-bbcc-636e6d64344a" containerName="nova-scheduler-scheduler" containerID="cri-o://d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de" gracePeriod=30 Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 00:29:23.240747 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"902efa6b-d07e-4589-b6e6-8016dfdbcd57","Type":"ContainerDied","Data":"df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03"} Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 00:29:23.241179 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" containerName="nova-metadata-metadata" containerID="cri-o://be2fe215086cd4058aea52c301ed09e04ac3143d7e54d38772b785701e47e5f8" gracePeriod=30 Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 
00:29:23.241341 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" containerName="nova-metadata-log" containerID="cri-o://bfa97c01ece2e8cbadd8eda7e12994d67d495e411ba60ed25dc9b412019a8f03" gracePeriod=30 Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 00:29:23.303360 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.3069766400000002 podStartE2EDuration="7.303338343s" podCreationTimestamp="2026-02-27 00:29:16 +0000 UTC" firstStartedPulling="2026-02-27 00:29:17.040825984 +0000 UTC m=+1426.298365538" lastFinishedPulling="2026-02-27 00:29:22.037187687 +0000 UTC m=+1431.294727241" observedRunningTime="2026-02-27 00:29:23.288527283 +0000 UTC m=+1432.546066837" watchObservedRunningTime="2026-02-27 00:29:23.303338343 +0000 UTC m=+1432.560877897" Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 00:29:23.324683 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39b2afc0-76d7-48e9-8528-f88e3ba22955" path="/var/lib/kubelet/pods/39b2afc0-76d7-48e9-8528-f88e3ba22955/volumes" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.252226 4781 generic.go:334] "Generic (PLEG): container finished" podID="5ed68203-3ac6-4133-92d9-175f234d5229" containerID="be2fe215086cd4058aea52c301ed09e04ac3143d7e54d38772b785701e47e5f8" exitCode=0 Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.252601 4781 generic.go:334] "Generic (PLEG): container finished" podID="5ed68203-3ac6-4133-92d9-175f234d5229" containerID="bfa97c01ece2e8cbadd8eda7e12994d67d495e411ba60ed25dc9b412019a8f03" exitCode=143 Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.254231 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ed68203-3ac6-4133-92d9-175f234d5229","Type":"ContainerDied","Data":"be2fe215086cd4058aea52c301ed09e04ac3143d7e54d38772b785701e47e5f8"} Feb 27 
00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.254286 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ed68203-3ac6-4133-92d9-175f234d5229","Type":"ContainerDied","Data":"bfa97c01ece2e8cbadd8eda7e12994d67d495e411ba60ed25dc9b412019a8f03"} Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.254312 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ed68203-3ac6-4133-92d9-175f234d5229","Type":"ContainerDied","Data":"1402200c1e03e61992b31653885e83e4b38fc47073559c639e90d417e5d65cb7"} Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.254330 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1402200c1e03e61992b31653885e83e4b38fc47073559c639e90d417e5d65cb7" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.325079 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.437529 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-config-data\") pod \"5ed68203-3ac6-4133-92d9-175f234d5229\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.437610 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-combined-ca-bundle\") pod \"5ed68203-3ac6-4133-92d9-175f234d5229\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.437727 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c25tq\" (UniqueName: \"kubernetes.io/projected/5ed68203-3ac6-4133-92d9-175f234d5229-kube-api-access-c25tq\") pod 
\"5ed68203-3ac6-4133-92d9-175f234d5229\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.437796 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed68203-3ac6-4133-92d9-175f234d5229-logs\") pod \"5ed68203-3ac6-4133-92d9-175f234d5229\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.437997 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-nova-metadata-tls-certs\") pod \"5ed68203-3ac6-4133-92d9-175f234d5229\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.438255 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ed68203-3ac6-4133-92d9-175f234d5229-logs" (OuterVolumeSpecName: "logs") pod "5ed68203-3ac6-4133-92d9-175f234d5229" (UID: "5ed68203-3ac6-4133-92d9-175f234d5229"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.438801 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed68203-3ac6-4133-92d9-175f234d5229-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.446676 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed68203-3ac6-4133-92d9-175f234d5229-kube-api-access-c25tq" (OuterVolumeSpecName: "kube-api-access-c25tq") pod "5ed68203-3ac6-4133-92d9-175f234d5229" (UID: "5ed68203-3ac6-4133-92d9-175f234d5229"). InnerVolumeSpecName "kube-api-access-c25tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.466373 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-config-data" (OuterVolumeSpecName: "config-data") pod "5ed68203-3ac6-4133-92d9-175f234d5229" (UID: "5ed68203-3ac6-4133-92d9-175f234d5229"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.485619 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ed68203-3ac6-4133-92d9-175f234d5229" (UID: "5ed68203-3ac6-4133-92d9-175f234d5229"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.496314 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5ed68203-3ac6-4133-92d9-175f234d5229" (UID: "5ed68203-3ac6-4133-92d9-175f234d5229"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.541179 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c25tq\" (UniqueName: \"kubernetes.io/projected/5ed68203-3ac6-4133-92d9-175f234d5229-kube-api-access-c25tq\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.541223 4781 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.541237 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.541250 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.185607 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.257167 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-config-data\") pod \"de057148-8197-4717-bbcc-636e6d64344a\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.257254 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ccw8\" (UniqueName: \"kubernetes.io/projected/de057148-8197-4717-bbcc-636e6d64344a-kube-api-access-9ccw8\") pod \"de057148-8197-4717-bbcc-636e6d64344a\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.257366 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-combined-ca-bundle\") pod \"de057148-8197-4717-bbcc-636e6d64344a\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.264160 4781 generic.go:334] "Generic (PLEG): container finished" podID="de057148-8197-4717-bbcc-636e6d64344a" containerID="d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de" exitCode=0 Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.264239 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.264297 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.264832 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"de057148-8197-4717-bbcc-636e6d64344a","Type":"ContainerDied","Data":"d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de"} Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.264865 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"de057148-8197-4717-bbcc-636e6d64344a","Type":"ContainerDied","Data":"5ecdf1c41abef4437c80f6d85c04db80a9d6858579c757ef6823795e81d59b23"} Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.264886 4781 scope.go:117] "RemoveContainer" containerID="d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.278795 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de057148-8197-4717-bbcc-636e6d64344a-kube-api-access-9ccw8" (OuterVolumeSpecName: "kube-api-access-9ccw8") pod "de057148-8197-4717-bbcc-636e6d64344a" (UID: "de057148-8197-4717-bbcc-636e6d64344a"). InnerVolumeSpecName "kube-api-access-9ccw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.293307 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de057148-8197-4717-bbcc-636e6d64344a" (UID: "de057148-8197-4717-bbcc-636e6d64344a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.303721 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-config-data" (OuterVolumeSpecName: "config-data") pod "de057148-8197-4717-bbcc-636e6d64344a" (UID: "de057148-8197-4717-bbcc-636e6d64344a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.361292 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.361335 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ccw8\" (UniqueName: \"kubernetes.io/projected/de057148-8197-4717-bbcc-636e6d64344a-kube-api-access-9ccw8\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.361354 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.382210 4781 scope.go:117] "RemoveContainer" containerID="d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de" Feb 27 00:29:25 crc kubenswrapper[4781]: E0227 00:29:25.385006 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de\": container with ID starting with d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de not found: ID does not exist" containerID="d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 
00:29:25.385059 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de"} err="failed to get container status \"d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de\": rpc error: code = NotFound desc = could not find container \"d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de\": container with ID starting with d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de not found: ID does not exist" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.394780 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.403699 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.415998 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:25 crc kubenswrapper[4781]: E0227 00:29:25.416407 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" containerName="nova-metadata-log" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416427 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" containerName="nova-metadata-log" Feb 27 00:29:25 crc kubenswrapper[4781]: E0227 00:29:25.416443 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de057148-8197-4717-bbcc-636e6d64344a" containerName="nova-scheduler-scheduler" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416450 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="de057148-8197-4717-bbcc-636e6d64344a" containerName="nova-scheduler-scheduler" Feb 27 00:29:25 crc kubenswrapper[4781]: E0227 00:29:25.416465 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd521dc6-4126-4c51-8634-66db8ba1412e" 
containerName="nova-manage" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416472 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd521dc6-4126-4c51-8634-66db8ba1412e" containerName="nova-manage" Feb 27 00:29:25 crc kubenswrapper[4781]: E0227 00:29:25.416487 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b2afc0-76d7-48e9-8528-f88e3ba22955" containerName="dnsmasq-dns" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416492 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b2afc0-76d7-48e9-8528-f88e3ba22955" containerName="dnsmasq-dns" Feb 27 00:29:25 crc kubenswrapper[4781]: E0227 00:29:25.416501 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" containerName="nova-metadata-metadata" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416508 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" containerName="nova-metadata-metadata" Feb 27 00:29:25 crc kubenswrapper[4781]: E0227 00:29:25.416523 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b2afc0-76d7-48e9-8528-f88e3ba22955" containerName="init" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416530 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b2afc0-76d7-48e9-8528-f88e3ba22955" containerName="init" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416741 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" containerName="nova-metadata-log" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416754 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" containerName="nova-metadata-metadata" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416769 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd521dc6-4126-4c51-8634-66db8ba1412e" 
containerName="nova-manage" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416779 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="39b2afc0-76d7-48e9-8528-f88e3ba22955" containerName="dnsmasq-dns" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416804 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="de057148-8197-4717-bbcc-636e6d64344a" containerName="nova-scheduler-scheduler" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.417791 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.420555 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.420791 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.446391 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.462674 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94rnf\" (UniqueName: \"kubernetes.io/projected/7524846b-772f-47a1-aaae-e7f29db2c0b5-kube-api-access-94rnf\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.462720 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-config-data\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.462779 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.462826 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.462851 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7524846b-772f-47a1-aaae-e7f29db2c0b5-logs\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.564778 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94rnf\" (UniqueName: \"kubernetes.io/projected/7524846b-772f-47a1-aaae-e7f29db2c0b5-kube-api-access-94rnf\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.564828 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-config-data\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.564900 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.564950 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.564985 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7524846b-772f-47a1-aaae-e7f29db2c0b5-logs\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.565577 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7524846b-772f-47a1-aaae-e7f29db2c0b5-logs\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.568529 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.568671 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc 
kubenswrapper[4781]: I0227 00:29:25.570193 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-config-data\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.586214 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94rnf\" (UniqueName: \"kubernetes.io/projected/7524846b-772f-47a1-aaae-e7f29db2c0b5-kube-api-access-94rnf\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.622787 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.644075 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.663533 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.665241 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.669744 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.675845 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.734242 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.769971 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-config-data\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.770170 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrw78\" (UniqueName: \"kubernetes.io/projected/f5e01f6b-d306-41ac-9988-156063c5af7d-kube-api-access-mrw78\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.770200 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.872363 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-config-data\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.872876 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrw78\" (UniqueName: \"kubernetes.io/projected/f5e01f6b-d306-41ac-9988-156063c5af7d-kube-api-access-mrw78\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: 
I0227 00:29:25.872908 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.880719 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-config-data\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.889948 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrw78\" (UniqueName: \"kubernetes.io/projected/f5e01f6b-d306-41ac-9988-156063c5af7d-kube-api-access-mrw78\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.890950 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.980544 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 00:29:26 crc kubenswrapper[4781]: I0227 00:29:26.195394 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:26 crc kubenswrapper[4781]: W0227 00:29:26.198807 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7524846b_772f_47a1_aaae_e7f29db2c0b5.slice/crio-38c766ee39e3c7f63ba0e025dfac0a7e85784b006de99cf2ffb71128d62e3b91 WatchSource:0}: Error finding container 38c766ee39e3c7f63ba0e025dfac0a7e85784b006de99cf2ffb71128d62e3b91: Status 404 returned error can't find the container with id 38c766ee39e3c7f63ba0e025dfac0a7e85784b006de99cf2ffb71128d62e3b91 Feb 27 00:29:26 crc kubenswrapper[4781]: I0227 00:29:26.275686 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7524846b-772f-47a1-aaae-e7f29db2c0b5","Type":"ContainerStarted","Data":"38c766ee39e3c7f63ba0e025dfac0a7e85784b006de99cf2ffb71128d62e3b91"} Feb 27 00:29:26 crc kubenswrapper[4781]: I0227 00:29:26.492937 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:29:26 crc kubenswrapper[4781]: W0227 00:29:26.507068 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5e01f6b_d306_41ac_9988_156063c5af7d.slice/crio-9fd55676c924089cae058f2bc5bdb6090578b3476a36c5a733237f94e45c9618 WatchSource:0}: Error finding container 9fd55676c924089cae058f2bc5bdb6090578b3476a36c5a733237f94e45c9618: Status 404 returned error can't find the container with id 9fd55676c924089cae058f2bc5bdb6090578b3476a36c5a733237f94e45c9618 Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.300362 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"7524846b-772f-47a1-aaae-e7f29db2c0b5","Type":"ContainerStarted","Data":"8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf"} Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.301215 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7524846b-772f-47a1-aaae-e7f29db2c0b5","Type":"ContainerStarted","Data":"a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9"} Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.302320 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f5e01f6b-d306-41ac-9988-156063c5af7d","Type":"ContainerStarted","Data":"8a434297e1d497ddfa074c1233744a9c79e7a3482bb8e37e36657a3849467eab"} Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.302365 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f5e01f6b-d306-41ac-9988-156063c5af7d","Type":"ContainerStarted","Data":"9fd55676c924089cae058f2bc5bdb6090578b3476a36c5a733237f94e45c9618"} Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.305888 4781 generic.go:334] "Generic (PLEG): container finished" podID="b607db2c-2aa3-48f0-9cd8-c5461797431c" containerID="39276ac01bb5ee770105ba2bf75f8d61d8081e22c89cdaa97c9f7ed7f2722110" exitCode=0 Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.305947 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tg9k8" event={"ID":"b607db2c-2aa3-48f0-9cd8-c5461797431c","Type":"ContainerDied","Data":"39276ac01bb5ee770105ba2bf75f8d61d8081e22c89cdaa97c9f7ed7f2722110"} Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.331683 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.331665443 podStartE2EDuration="2.331665443s" podCreationTimestamp="2026-02-27 00:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:27.323341918 +0000 UTC m=+1436.580881502" watchObservedRunningTime="2026-02-27 00:29:27.331665443 +0000 UTC m=+1436.589204997" Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.341377 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" path="/var/lib/kubelet/pods/5ed68203-3ac6-4133-92d9-175f234d5229/volumes" Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.342209 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de057148-8197-4717-bbcc-636e6d64344a" path="/var/lib/kubelet/pods/de057148-8197-4717-bbcc-636e6d64344a/volumes" Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.366856 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.366834733 podStartE2EDuration="2.366834733s" podCreationTimestamp="2026-02-27 00:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:27.356774352 +0000 UTC m=+1436.614313906" watchObservedRunningTime="2026-02-27 00:29:27.366834733 +0000 UTC m=+1436.624374287" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.271753 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.321284 4781 generic.go:334] "Generic (PLEG): container finished" podID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerID="ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e" exitCode=0 Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.321335 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.321383 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"902efa6b-d07e-4589-b6e6-8016dfdbcd57","Type":"ContainerDied","Data":"ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e"} Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.321411 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"902efa6b-d07e-4589-b6e6-8016dfdbcd57","Type":"ContainerDied","Data":"10dbcf9aa331b09eb162dae4f7eb67ae5890ce7956c09aaa8725da5e211a8996"} Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.321429 4781 scope.go:117] "RemoveContainer" containerID="ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.324788 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/902efa6b-d07e-4589-b6e6-8016dfdbcd57-logs\") pod \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.324907 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-combined-ca-bundle\") pod \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.325041 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7gxp\" (UniqueName: \"kubernetes.io/projected/902efa6b-d07e-4589-b6e6-8016dfdbcd57-kube-api-access-x7gxp\") pod \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.325205 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-config-data\") pod \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.326921 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/902efa6b-d07e-4589-b6e6-8016dfdbcd57-logs" (OuterVolumeSpecName: "logs") pod "902efa6b-d07e-4589-b6e6-8016dfdbcd57" (UID: "902efa6b-d07e-4589-b6e6-8016dfdbcd57"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.330921 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/902efa6b-d07e-4589-b6e6-8016dfdbcd57-kube-api-access-x7gxp" (OuterVolumeSpecName: "kube-api-access-x7gxp") pod "902efa6b-d07e-4589-b6e6-8016dfdbcd57" (UID: "902efa6b-d07e-4589-b6e6-8016dfdbcd57"). InnerVolumeSpecName "kube-api-access-x7gxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.358208 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "902efa6b-d07e-4589-b6e6-8016dfdbcd57" (UID: "902efa6b-d07e-4589-b6e6-8016dfdbcd57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.363500 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-config-data" (OuterVolumeSpecName: "config-data") pod "902efa6b-d07e-4589-b6e6-8016dfdbcd57" (UID: "902efa6b-d07e-4589-b6e6-8016dfdbcd57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.432624 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7gxp\" (UniqueName: \"kubernetes.io/projected/902efa6b-d07e-4589-b6e6-8016dfdbcd57-kube-api-access-x7gxp\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.432699 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.432713 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/902efa6b-d07e-4589-b6e6-8016dfdbcd57-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.432754 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.536643 4781 scope.go:117] "RemoveContainer" containerID="df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.571949 4781 scope.go:117] "RemoveContainer" containerID="ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e" Feb 27 00:29:28 crc kubenswrapper[4781]: E0227 00:29:28.583757 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e\": container with ID starting with ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e not found: ID does not exist" containerID="ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 
00:29:28.583810 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e"} err="failed to get container status \"ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e\": rpc error: code = NotFound desc = could not find container \"ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e\": container with ID starting with ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e not found: ID does not exist" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.583841 4781 scope.go:117] "RemoveContainer" containerID="df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03" Feb 27 00:29:28 crc kubenswrapper[4781]: E0227 00:29:28.584873 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03\": container with ID starting with df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03 not found: ID does not exist" containerID="df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.584918 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03"} err="failed to get container status \"df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03\": rpc error: code = NotFound desc = could not find container \"df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03\": container with ID starting with df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03 not found: ID does not exist" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.659100 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.674859 4781 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.687324 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:28 crc kubenswrapper[4781]: E0227 00:29:28.687835 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-api" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.687854 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-api" Feb 27 00:29:28 crc kubenswrapper[4781]: E0227 00:29:28.687869 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-log" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.687875 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-log" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.688110 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-log" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.688130 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-api" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.689338 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.692052 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.708431 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.731694 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tg9k8" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.739225 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.739311 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-config-data\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.739456 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p577n\" (UniqueName: \"kubernetes.io/projected/e57ffac0-932b-42fd-bc09-ae357b25eeb1-kube-api-access-p577n\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.739478 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e57ffac0-932b-42fd-bc09-ae357b25eeb1-logs\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.841705 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-combined-ca-bundle\") pod \"b607db2c-2aa3-48f0-9cd8-c5461797431c\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.841925 4781 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-scripts\") pod \"b607db2c-2aa3-48f0-9cd8-c5461797431c\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.842039 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-config-data\") pod \"b607db2c-2aa3-48f0-9cd8-c5461797431c\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.842090 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7dh2\" (UniqueName: \"kubernetes.io/projected/b607db2c-2aa3-48f0-9cd8-c5461797431c-kube-api-access-f7dh2\") pod \"b607db2c-2aa3-48f0-9cd8-c5461797431c\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.842405 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p577n\" (UniqueName: \"kubernetes.io/projected/e57ffac0-932b-42fd-bc09-ae357b25eeb1-kube-api-access-p577n\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.842436 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e57ffac0-932b-42fd-bc09-ae357b25eeb1-logs\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.842561 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.842593 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-config-data\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.843124 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e57ffac0-932b-42fd-bc09-ae357b25eeb1-logs\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.845588 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b607db2c-2aa3-48f0-9cd8-c5461797431c-kube-api-access-f7dh2" (OuterVolumeSpecName: "kube-api-access-f7dh2") pod "b607db2c-2aa3-48f0-9cd8-c5461797431c" (UID: "b607db2c-2aa3-48f0-9cd8-c5461797431c"). InnerVolumeSpecName "kube-api-access-f7dh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.846862 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.852179 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-scripts" (OuterVolumeSpecName: "scripts") pod "b607db2c-2aa3-48f0-9cd8-c5461797431c" (UID: "b607db2c-2aa3-48f0-9cd8-c5461797431c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.852210 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-config-data\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.860673 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p577n\" (UniqueName: \"kubernetes.io/projected/e57ffac0-932b-42fd-bc09-ae357b25eeb1-kube-api-access-p577n\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.876568 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-config-data" (OuterVolumeSpecName: "config-data") pod "b607db2c-2aa3-48f0-9cd8-c5461797431c" (UID: "b607db2c-2aa3-48f0-9cd8-c5461797431c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.877269 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b607db2c-2aa3-48f0-9cd8-c5461797431c" (UID: "b607db2c-2aa3-48f0-9cd8-c5461797431c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.944109 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.944432 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.944443 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7dh2\" (UniqueName: \"kubernetes.io/projected/b607db2c-2aa3-48f0-9cd8-c5461797431c-kube-api-access-f7dh2\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.944454 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.025780 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.320019 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" path="/var/lib/kubelet/pods/902efa6b-d07e-4589-b6e6-8016dfdbcd57/volumes"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.334143 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tg9k8" event={"ID":"b607db2c-2aa3-48f0-9cd8-c5461797431c","Type":"ContainerDied","Data":"34decb527184bd4ff609a070de28e4da18ec18094d195f97919d2454806f58d8"}
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.334181 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34decb527184bd4ff609a070de28e4da18ec18094d195f97919d2454806f58d8"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.334216 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tg9k8"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.419806 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 27 00:29:29 crc kubenswrapper[4781]: E0227 00:29:29.421253 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607db2c-2aa3-48f0-9cd8-c5461797431c" containerName="nova-cell1-conductor-db-sync"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.421274 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607db2c-2aa3-48f0-9cd8-c5461797431c" containerName="nova-cell1-conductor-db-sync"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.421458 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607db2c-2aa3-48f0-9cd8-c5461797431c" containerName="nova-cell1-conductor-db-sync"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.422189 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.425551 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.430541 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.457125 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8c40a18-7bbd-4d06-8a8a-427de95016fa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.457173 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz6mf\" (UniqueName: \"kubernetes.io/projected/c8c40a18-7bbd-4d06-8a8a-427de95016fa-kube-api-access-rz6mf\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.457312 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8c40a18-7bbd-4d06-8a8a-427de95016fa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.502677 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.558967 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8c40a18-7bbd-4d06-8a8a-427de95016fa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.559057 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8c40a18-7bbd-4d06-8a8a-427de95016fa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.559091 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz6mf\" (UniqueName: \"kubernetes.io/projected/c8c40a18-7bbd-4d06-8a8a-427de95016fa-kube-api-access-rz6mf\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.566074 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8c40a18-7bbd-4d06-8a8a-427de95016fa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.566129 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8c40a18-7bbd-4d06-8a8a-427de95016fa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.575503 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz6mf\" (UniqueName: \"kubernetes.io/projected/c8c40a18-7bbd-4d06-8a8a-427de95016fa-kube-api-access-rz6mf\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.738286 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.193457 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 27 00:29:30 crc kubenswrapper[4781]: W0227 00:29:30.201909 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8c40a18_7bbd_4d06_8a8a_427de95016fa.slice/crio-586fdf84ca64b6fcd432c2744daca0f29ae595ee9ccb98b66f85d651192101c5 WatchSource:0}: Error finding container 586fdf84ca64b6fcd432c2744daca0f29ae595ee9ccb98b66f85d651192101c5: Status 404 returned error can't find the container with id 586fdf84ca64b6fcd432c2744daca0f29ae595ee9ccb98b66f85d651192101c5
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.347515 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e57ffac0-932b-42fd-bc09-ae357b25eeb1","Type":"ContainerStarted","Data":"b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d"}
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.347571 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e57ffac0-932b-42fd-bc09-ae357b25eeb1","Type":"ContainerStarted","Data":"45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922"}
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.347586 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e57ffac0-932b-42fd-bc09-ae357b25eeb1","Type":"ContainerStarted","Data":"52f4435171fea776734e465dadfa7d220c142ef75d0364376751af62a2757023"}
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.350617 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c8c40a18-7bbd-4d06-8a8a-427de95016fa","Type":"ContainerStarted","Data":"586fdf84ca64b6fcd432c2744daca0f29ae595ee9ccb98b66f85d651192101c5"}
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.375599 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.375578358 podStartE2EDuration="2.375578358s" podCreationTimestamp="2026-02-27 00:29:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:30.366971636 +0000 UTC m=+1439.624511210" watchObservedRunningTime="2026-02-27 00:29:30.375578358 +0000 UTC m=+1439.633117912"
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.735408 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.735467 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.981563 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 27 00:29:31 crc kubenswrapper[4781]: I0227 00:29:31.360830 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c8c40a18-7bbd-4d06-8a8a-427de95016fa","Type":"ContainerStarted","Data":"6f523ef0991ad019f7285afab5b492d902d65f84dce4c9da8e302ff112aac4c6"}
Feb 27 00:29:31 crc kubenswrapper[4781]: I0227 00:29:31.361640 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:31 crc kubenswrapper[4781]: I0227 00:29:31.381377 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.381357011 podStartE2EDuration="2.381357011s" podCreationTimestamp="2026-02-27 00:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:31.381189056 +0000 UTC m=+1440.638728610" watchObservedRunningTime="2026-02-27 00:29:31.381357011 +0000 UTC m=+1440.638896565"
Feb 27 00:29:35 crc kubenswrapper[4781]: I0227 00:29:35.735345 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 27 00:29:35 crc kubenswrapper[4781]: I0227 00:29:35.736032 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 27 00:29:35 crc kubenswrapper[4781]: I0227 00:29:35.982224 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 27 00:29:36 crc kubenswrapper[4781]: I0227 00:29:36.026474 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 27 00:29:36 crc kubenswrapper[4781]: I0227 00:29:36.454885 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 27 00:29:36 crc kubenswrapper[4781]: I0227 00:29:36.749820 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:29:36 crc kubenswrapper[4781]: I0227 00:29:36.749895 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:29:39 crc kubenswrapper[4781]: I0227 00:29:39.028009 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 27 00:29:39 crc kubenswrapper[4781]: I0227 00:29:39.029203 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 27 00:29:39 crc kubenswrapper[4781]: I0227 00:29:39.770939 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:40 crc kubenswrapper[4781]: I0227 00:29:40.109987 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.227:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:29:40 crc kubenswrapper[4781]: I0227 00:29:40.110041 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.227:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:29:45 crc kubenswrapper[4781]: I0227 00:29:45.739469 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 27 00:29:45 crc kubenswrapper[4781]: I0227 00:29:45.740042 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 27 00:29:45 crc kubenswrapper[4781]: I0227 00:29:45.744005 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 27 00:29:46 crc kubenswrapper[4781]: I0227 00:29:46.446250 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 27 00:29:46 crc kubenswrapper[4781]: I0227 00:29:46.522192 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 27 00:29:47 crc kubenswrapper[4781]: E0227 00:29:47.312409 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d780ba2_9829_430e_9a56_0b5b052bfbb7.slice/crio-conmon-7cb922ac2fcfd76994a7254d975044d1fe0a7563db3547acc86bfb78f94c47a2.scope\": RecentStats: unable to find data in memory cache]"
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.527321 4781 generic.go:334] "Generic (PLEG): container finished" podID="8d780ba2-9829-430e-9a56-0b5b052bfbb7" containerID="7cb922ac2fcfd76994a7254d975044d1fe0a7563db3547acc86bfb78f94c47a2" exitCode=137
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.527414 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8d780ba2-9829-430e-9a56-0b5b052bfbb7","Type":"ContainerDied","Data":"7cb922ac2fcfd76994a7254d975044d1fe0a7563db3547acc86bfb78f94c47a2"}
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.527465 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8d780ba2-9829-430e-9a56-0b5b052bfbb7","Type":"ContainerDied","Data":"c962c4b95a57d2e6b554d146c58ed360df05b5f45a673155452c828cacdad50b"}
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.527481 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c962c4b95a57d2e6b554d146c58ed360df05b5f45a673155452c828cacdad50b"
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.573698 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.736954 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-combined-ca-bundle\") pod \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") "
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.737138 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-config-data\") pod \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") "
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.737165 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxdkk\" (UniqueName: \"kubernetes.io/projected/8d780ba2-9829-430e-9a56-0b5b052bfbb7-kube-api-access-bxdkk\") pod \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") "
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.755476 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d780ba2-9829-430e-9a56-0b5b052bfbb7-kube-api-access-bxdkk" (OuterVolumeSpecName: "kube-api-access-bxdkk") pod "8d780ba2-9829-430e-9a56-0b5b052bfbb7" (UID: "8d780ba2-9829-430e-9a56-0b5b052bfbb7"). InnerVolumeSpecName "kube-api-access-bxdkk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.775270 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-config-data" (OuterVolumeSpecName: "config-data") pod "8d780ba2-9829-430e-9a56-0b5b052bfbb7" (UID: "8d780ba2-9829-430e-9a56-0b5b052bfbb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.782762 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d780ba2-9829-430e-9a56-0b5b052bfbb7" (UID: "8d780ba2-9829-430e-9a56-0b5b052bfbb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.839282 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.839313 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.839324 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxdkk\" (UniqueName: \"kubernetes.io/projected/8d780ba2-9829-430e-9a56-0b5b052bfbb7-kube-api-access-bxdkk\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.534870 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.566267 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.576276 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.592669 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 27 00:29:48 crc kubenswrapper[4781]: E0227 00:29:48.593210 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d780ba2-9829-430e-9a56-0b5b052bfbb7" containerName="nova-cell1-novncproxy-novncproxy"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.593232 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d780ba2-9829-430e-9a56-0b5b052bfbb7" containerName="nova-cell1-novncproxy-novncproxy"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.593697 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d780ba2-9829-430e-9a56-0b5b052bfbb7" containerName="nova-cell1-novncproxy-novncproxy"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.594704 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.597552 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.597679 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.598061 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.604171 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.690417 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-49thr"]
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.695171 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-49thr"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.697553 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.697593 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.697743 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvfsn\" (UniqueName: \"kubernetes.io/projected/a3b399a8-7654-47f3-be04-759080f4f180-kube-api-access-fvfsn\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.697776 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.698082 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.722953 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-49thr"]
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.800587 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-utilities\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.800688 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.800709 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.800738 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-catalog-content\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.800775 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwm7q\" (UniqueName: \"kubernetes.io/projected/cc2a000c-f169-4622-8c82-cd4c2baa730a-kube-api-access-dwm7q\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.800940 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvfsn\" (UniqueName: \"kubernetes.io/projected/a3b399a8-7654-47f3-be04-759080f4f180-kube-api-access-fvfsn\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.801025 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.801055 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.806113 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.806156 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.807676 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.815683 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.825177 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvfsn\" (UniqueName: \"kubernetes.io/projected/a3b399a8-7654-47f3-be04-759080f4f180-kube-api-access-fvfsn\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.903096 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-utilities\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.903181 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-catalog-content\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.903224 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwm7q\" (UniqueName: \"kubernetes.io/projected/cc2a000c-f169-4622-8c82-cd4c2baa730a-kube-api-access-dwm7q\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.905278 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-utilities\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.905498 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-catalog-content\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.910800 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.923972 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwm7q\" (UniqueName: \"kubernetes.io/projected/cc2a000c-f169-4622-8c82-cd4c2baa730a-kube-api-access-dwm7q\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.013512 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-49thr"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.044946 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.046301 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.055571 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.060858 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.332295 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d780ba2-9829-430e-9a56-0b5b052bfbb7" path="/var/lib/kubelet/pods/8d780ba2-9829-430e-9a56-0b5b052bfbb7/volumes"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.443420 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 27 00:29:49 crc kubenswrapper[4781]: W0227 00:29:49.448431 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3b399a8_7654_47f3_be04_759080f4f180.slice/crio-756fbef35db868ba808ee405957034dabe2c897cd150b7acda8df14ae20dd8f7 WatchSource:0}: Error finding container 756fbef35db868ba808ee405957034dabe2c897cd150b7acda8df14ae20dd8f7: Status 404 returned error can't find the container with id 756fbef35db868ba808ee405957034dabe2c897cd150b7acda8df14ae20dd8f7
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.551843 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a3b399a8-7654-47f3-be04-759080f4f180","Type":"ContainerStarted","Data":"756fbef35db868ba808ee405957034dabe2c897cd150b7acda8df14ae20dd8f7"}
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.551887 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.564881 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.623959 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-49thr"]
Feb 27 00:29:49 crc kubenswrapper[4781]: W0227 00:29:49.624240 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc2a000c_f169_4622_8c82_cd4c2baa730a.slice/crio-91f3015c87e164a4f8b695fde0fa0251c13f94e03e4150c65b9ff72adaa25616 WatchSource:0}: Error finding container 91f3015c87e164a4f8b695fde0fa0251c13f94e03e4150c65b9ff72adaa25616: Status 404 returned error can't find the container with id 91f3015c87e164a4f8b695fde0fa0251c13f94e03e4150c65b9ff72adaa25616
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.743288 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-hm24r"]
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.753381 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.761435 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-hm24r"]
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.839559 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-config\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.839641 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.839670 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.839718 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwcnr\" (UniqueName: \"kubernetes.io/projected/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-kube-api-access-cwcnr\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.839765 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.839850 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.942159 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.942281 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-config\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.942308 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r"
Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.942327 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.942361 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwcnr\" (UniqueName: \"kubernetes.io/projected/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-kube-api-access-cwcnr\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.942398 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.943248 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.943840 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.944331 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-config\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.945129 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.945510 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.966227 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwcnr\" (UniqueName: \"kubernetes.io/projected/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-kube-api-access-cwcnr\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:50 crc kubenswrapper[4781]: I0227 00:29:50.094087 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:50 crc kubenswrapper[4781]: I0227 00:29:50.562398 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a3b399a8-7654-47f3-be04-759080f4f180","Type":"ContainerStarted","Data":"94ca6f18e3eef7f80d8723d209ec5d280431dc24efb36670ddfc4b66fb3e818e"} Feb 27 00:29:50 crc kubenswrapper[4781]: I0227 00:29:50.565342 4781 generic.go:334] "Generic (PLEG): container finished" podID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerID="a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a" exitCode=0 Feb 27 00:29:50 crc kubenswrapper[4781]: I0227 00:29:50.565554 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49thr" event={"ID":"cc2a000c-f169-4622-8c82-cd4c2baa730a","Type":"ContainerDied","Data":"a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a"} Feb 27 00:29:50 crc kubenswrapper[4781]: I0227 00:29:50.565599 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49thr" event={"ID":"cc2a000c-f169-4622-8c82-cd4c2baa730a","Type":"ContainerStarted","Data":"91f3015c87e164a4f8b695fde0fa0251c13f94e03e4150c65b9ff72adaa25616"} Feb 27 00:29:50 crc kubenswrapper[4781]: I0227 00:29:50.671374 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.671331943 podStartE2EDuration="2.671331943s" podCreationTimestamp="2026-02-27 00:29:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:50.58573778 +0000 UTC m=+1459.843277344" watchObservedRunningTime="2026-02-27 00:29:50.671331943 +0000 UTC m=+1459.928871487" Feb 27 00:29:50 crc kubenswrapper[4781]: W0227 00:29:50.727324 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8bd7379_6c29_4c0f_bb7e_14c18f98a18e.slice/crio-bc2ce3ec147ae0ea063d3bc5998697394125f3fd2381d07ae16cdf6df5227b71 WatchSource:0}: Error finding container bc2ce3ec147ae0ea063d3bc5998697394125f3fd2381d07ae16cdf6df5227b71: Status 404 returned error can't find the container with id bc2ce3ec147ae0ea063d3bc5998697394125f3fd2381d07ae16cdf6df5227b71 Feb 27 00:29:50 crc kubenswrapper[4781]: I0227 00:29:50.774575 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-hm24r"] Feb 27 00:29:51 crc kubenswrapper[4781]: I0227 00:29:51.578993 4781 generic.go:334] "Generic (PLEG): container finished" podID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" containerID="0477def692642480b7baa681e79da18341ef273274b3570944d4f51dd3971947" exitCode=0 Feb 27 00:29:51 crc kubenswrapper[4781]: I0227 00:29:51.579198 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" event={"ID":"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e","Type":"ContainerDied","Data":"0477def692642480b7baa681e79da18341ef273274b3570944d4f51dd3971947"} Feb 27 00:29:51 crc kubenswrapper[4781]: I0227 00:29:51.580705 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" event={"ID":"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e","Type":"ContainerStarted","Data":"bc2ce3ec147ae0ea063d3bc5998697394125f3fd2381d07ae16cdf6df5227b71"} Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 00:29:52.590812 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" event={"ID":"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e","Type":"ContainerStarted","Data":"e59534a993c981832971b7ef17c8f1e9f9d24b23112b98369b8ccf0ba58923af"} Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 00:29:52.591978 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 
00:29:52.592719 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49thr" event={"ID":"cc2a000c-f169-4622-8c82-cd4c2baa730a","Type":"ContainerStarted","Data":"4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0"} Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 00:29:52.615012 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" podStartSLOduration=3.614992533 podStartE2EDuration="3.614992533s" podCreationTimestamp="2026-02-27 00:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:52.606159984 +0000 UTC m=+1461.863699548" watchObservedRunningTime="2026-02-27 00:29:52.614992533 +0000 UTC m=+1461.872532087" Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 00:29:52.761474 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 00:29:52.761824 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="ceilometer-central-agent" containerID="cri-o://3ddb72adfd8dadbe432eb551c304f261946dae5663273b00c2b5c6ab9ec5b0b1" gracePeriod=30 Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 00:29:52.762315 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="proxy-httpd" containerID="cri-o://78c52f488afaab989176f5c5ab096fb61a4e74a72dc5d52ce83048b14f67d902" gracePeriod=30 Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 00:29:52.762390 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="sg-core" 
containerID="cri-o://fd9909df11f574e0138a430f34c72bf18ace2c57464e54425e45df0b7fd14f75" gracePeriod=30 Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 00:29:52.762442 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="ceilometer-notification-agent" containerID="cri-o://9a609615b1e77c141503575f4b85bd73b7b9605cdd075e949757163fb3230f19" gracePeriod=30 Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.605225 4781 generic.go:334] "Generic (PLEG): container finished" podID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerID="78c52f488afaab989176f5c5ab096fb61a4e74a72dc5d52ce83048b14f67d902" exitCode=0 Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.605464 4781 generic.go:334] "Generic (PLEG): container finished" podID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerID="fd9909df11f574e0138a430f34c72bf18ace2c57464e54425e45df0b7fd14f75" exitCode=2 Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.605471 4781 generic.go:334] "Generic (PLEG): container finished" podID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerID="3ddb72adfd8dadbe432eb551c304f261946dae5663273b00c2b5c6ab9ec5b0b1" exitCode=0 Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.605272 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerDied","Data":"78c52f488afaab989176f5c5ab096fb61a4e74a72dc5d52ce83048b14f67d902"} Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.605551 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerDied","Data":"fd9909df11f574e0138a430f34c72bf18ace2c57464e54425e45df0b7fd14f75"} Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.605562 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerDied","Data":"3ddb72adfd8dadbe432eb551c304f261946dae5663273b00c2b5c6ab9ec5b0b1"} Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.614217 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.614511 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-log" containerID="cri-o://45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922" gracePeriod=30 Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.614708 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-api" containerID="cri-o://b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d" gracePeriod=30 Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.912173 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:54 crc kubenswrapper[4781]: I0227 00:29:54.615145 4781 generic.go:334] "Generic (PLEG): container finished" podID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerID="45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922" exitCode=143 Feb 27 00:29:54 crc kubenswrapper[4781]: I0227 00:29:54.615210 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e57ffac0-932b-42fd-bc09-ae357b25eeb1","Type":"ContainerDied","Data":"45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922"} Feb 27 00:29:56 crc kubenswrapper[4781]: I0227 00:29:56.642024 4781 generic.go:334] "Generic (PLEG): container finished" podID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerID="9a609615b1e77c141503575f4b85bd73b7b9605cdd075e949757163fb3230f19" exitCode=0 Feb 27 00:29:56 crc kubenswrapper[4781]: 
I0227 00:29:56.642095 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerDied","Data":"9a609615b1e77c141503575f4b85bd73b7b9605cdd075e949757163fb3230f19"} Feb 27 00:29:56 crc kubenswrapper[4781]: I0227 00:29:56.998416 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.099374 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-combined-ca-bundle\") pod \"7825d67f-c124-4ee9-9e74-32c35c4370c0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.099457 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-config-data\") pod \"7825d67f-c124-4ee9-9e74-32c35c4370c0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.099599 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-sg-core-conf-yaml\") pod \"7825d67f-c124-4ee9-9e74-32c35c4370c0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.099668 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-scripts\") pod \"7825d67f-c124-4ee9-9e74-32c35c4370c0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.099761 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-run-httpd\") pod \"7825d67f-c124-4ee9-9e74-32c35c4370c0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.099809 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-log-httpd\") pod \"7825d67f-c124-4ee9-9e74-32c35c4370c0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.099948 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tthpk\" (UniqueName: \"kubernetes.io/projected/7825d67f-c124-4ee9-9e74-32c35c4370c0-kube-api-access-tthpk\") pod \"7825d67f-c124-4ee9-9e74-32c35c4370c0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.101025 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7825d67f-c124-4ee9-9e74-32c35c4370c0" (UID: "7825d67f-c124-4ee9-9e74-32c35c4370c0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.101479 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7825d67f-c124-4ee9-9e74-32c35c4370c0" (UID: "7825d67f-c124-4ee9-9e74-32c35c4370c0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.104915 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-scripts" (OuterVolumeSpecName: "scripts") pod "7825d67f-c124-4ee9-9e74-32c35c4370c0" (UID: "7825d67f-c124-4ee9-9e74-32c35c4370c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.118944 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7825d67f-c124-4ee9-9e74-32c35c4370c0-kube-api-access-tthpk" (OuterVolumeSpecName: "kube-api-access-tthpk") pod "7825d67f-c124-4ee9-9e74-32c35c4370c0" (UID: "7825d67f-c124-4ee9-9e74-32c35c4370c0"). InnerVolumeSpecName "kube-api-access-tthpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.154814 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7825d67f-c124-4ee9-9e74-32c35c4370c0" (UID: "7825d67f-c124-4ee9-9e74-32c35c4370c0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.201854 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tthpk\" (UniqueName: \"kubernetes.io/projected/7825d67f-c124-4ee9-9e74-32c35c4370c0-kube-api-access-tthpk\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.201962 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.202014 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.202074 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.202130 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.202300 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7825d67f-c124-4ee9-9e74-32c35c4370c0" (UID: "7825d67f-c124-4ee9-9e74-32c35c4370c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.236530 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-config-data" (OuterVolumeSpecName: "config-data") pod "7825d67f-c124-4ee9-9e74-32c35c4370c0" (UID: "7825d67f-c124-4ee9-9e74-32c35c4370c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.260543 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.303084 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-combined-ca-bundle\") pod \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.303236 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-config-data\") pod \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.303310 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e57ffac0-932b-42fd-bc09-ae357b25eeb1-logs\") pod \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.303349 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p577n\" (UniqueName: \"kubernetes.io/projected/e57ffac0-932b-42fd-bc09-ae357b25eeb1-kube-api-access-p577n\") pod 
\"e57ffac0-932b-42fd-bc09-ae357b25eeb1\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.304003 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.304044 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.304483 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e57ffac0-932b-42fd-bc09-ae357b25eeb1-logs" (OuterVolumeSpecName: "logs") pod "e57ffac0-932b-42fd-bc09-ae357b25eeb1" (UID: "e57ffac0-932b-42fd-bc09-ae357b25eeb1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.312317 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e57ffac0-932b-42fd-bc09-ae357b25eeb1-kube-api-access-p577n" (OuterVolumeSpecName: "kube-api-access-p577n") pod "e57ffac0-932b-42fd-bc09-ae357b25eeb1" (UID: "e57ffac0-932b-42fd-bc09-ae357b25eeb1"). InnerVolumeSpecName "kube-api-access-p577n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.331643 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-config-data" (OuterVolumeSpecName: "config-data") pod "e57ffac0-932b-42fd-bc09-ae357b25eeb1" (UID: "e57ffac0-932b-42fd-bc09-ae357b25eeb1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.359502 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e57ffac0-932b-42fd-bc09-ae357b25eeb1" (UID: "e57ffac0-932b-42fd-bc09-ae357b25eeb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.405574 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.405604 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e57ffac0-932b-42fd-bc09-ae357b25eeb1-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.405616 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p577n\" (UniqueName: \"kubernetes.io/projected/e57ffac0-932b-42fd-bc09-ae357b25eeb1-kube-api-access-p577n\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.405641 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.612381 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7825d67f_c124_4ee9_9e74_32c35c4370c0.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7825d67f_c124_4ee9_9e74_32c35c4370c0.slice/crio-a90b7ced7061699d62e894c9b3b31c21fe93acf06b438953563f0da53923c22d\": RecentStats: unable to find data in memory cache]" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.654199 4781 generic.go:334] "Generic (PLEG): container finished" podID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerID="4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0" exitCode=0 Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.654287 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49thr" event={"ID":"cc2a000c-f169-4622-8c82-cd4c2baa730a","Type":"ContainerDied","Data":"4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0"} Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.657072 4781 generic.go:334] "Generic (PLEG): container finished" podID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerID="b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d" exitCode=0 Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.657126 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e57ffac0-932b-42fd-bc09-ae357b25eeb1","Type":"ContainerDied","Data":"b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d"} Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.657158 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e57ffac0-932b-42fd-bc09-ae357b25eeb1","Type":"ContainerDied","Data":"52f4435171fea776734e465dadfa7d220c142ef75d0364376751af62a2757023"} Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.657170 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.657176 4781 scope.go:117] "RemoveContainer" containerID="b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.661650 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerDied","Data":"a90b7ced7061699d62e894c9b3b31c21fe93acf06b438953563f0da53923c22d"} Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.661724 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.696619 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.709166 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.711024 4781 scope.go:117] "RemoveContainer" containerID="45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.721671 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.738944 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.750357 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.751271 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="ceilometer-notification-agent" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.751378 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" 
containerName="ceilometer-notification-agent" Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.751519 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="ceilometer-central-agent" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.751634 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="ceilometer-central-agent" Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.751740 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-api" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.751830 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-api" Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.751923 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="sg-core" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.752011 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="sg-core" Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.752105 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="proxy-httpd" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.752203 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="proxy-httpd" Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.752315 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-log" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.752405 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" 
containerName="nova-api-log" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.752843 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="sg-core" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.752972 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="proxy-httpd" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.753072 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="ceilometer-central-agent" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.753160 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-api" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.753275 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="ceilometer-notification-agent" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.753351 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-log" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.756476 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.761429 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.761511 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.763931 4781 scope.go:117] "RemoveContainer" containerID="b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d" Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.767396 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d\": container with ID starting with b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d not found: ID does not exist" containerID="b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.767540 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d"} err="failed to get container status \"b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d\": rpc error: code = NotFound desc = could not find container \"b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d\": container with ID starting with b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d not found: ID does not exist" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.767660 4781 scope.go:117] "RemoveContainer" containerID="45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922" Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.768075 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922\": container with ID starting with 45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922 not found: ID does not exist" containerID="45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.768175 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922"} err="failed to get container status \"45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922\": rpc error: code = NotFound desc = could not find container \"45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922\": container with ID starting with 45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922 not found: ID does not exist" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.768267 4781 scope.go:117] "RemoveContainer" containerID="78c52f488afaab989176f5c5ab096fb61a4e74a72dc5d52ce83048b14f67d902" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.766836 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.780843 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.782601 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.787904 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.788216 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.788413 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.796419 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.814044 4781 scope.go:117] "RemoveContainer" containerID="fd9909df11f574e0138a430f34c72bf18ace2c57464e54425e45df0b7fd14f75" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819320 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-run-httpd\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819376 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-config-data\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819399 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 
00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819420 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-config-data\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819433 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-log-httpd\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819450 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ecbcf96-0260-4e87-afe5-9acc6098ec59-logs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819753 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-public-tls-certs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819824 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmfqp\" (UniqueName: \"kubernetes.io/projected/0ecbcf96-0260-4e87-afe5-9acc6098ec59-kube-api-access-vmfqp\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819865 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.820041 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5spl\" (UniqueName: \"kubernetes.io/projected/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-kube-api-access-b5spl\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.820169 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-scripts\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.820248 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.820279 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.833617 4781 scope.go:117] "RemoveContainer" containerID="9a609615b1e77c141503575f4b85bd73b7b9605cdd075e949757163fb3230f19" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.854068 4781 scope.go:117] "RemoveContainer" 
containerID="3ddb72adfd8dadbe432eb551c304f261946dae5663273b00c2b5c6ab9ec5b0b1" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922344 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-public-tls-certs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922401 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmfqp\" (UniqueName: \"kubernetes.io/projected/0ecbcf96-0260-4e87-afe5-9acc6098ec59-kube-api-access-vmfqp\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922432 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922524 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5spl\" (UniqueName: \"kubernetes.io/projected/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-kube-api-access-b5spl\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922561 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-scripts\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922584 4781 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922605 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922659 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-run-httpd\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922688 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-config-data\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922712 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922739 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-config-data\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 
27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922759 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-log-httpd\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922782 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ecbcf96-0260-4e87-afe5-9acc6098ec59-logs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.923230 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ecbcf96-0260-4e87-afe5-9acc6098ec59-logs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.923581 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-run-httpd\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.923896 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-log-httpd\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.928266 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-public-tls-certs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " 
pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.928852 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.929218 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.929371 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-config-data\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.930243 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.934621 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.934729 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-scripts\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.940515 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-config-data\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.940504 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5spl\" (UniqueName: \"kubernetes.io/projected/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-kube-api-access-b5spl\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.943636 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmfqp\" (UniqueName: \"kubernetes.io/projected/0ecbcf96-0260-4e87-afe5-9acc6098ec59-kube-api-access-vmfqp\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.110826 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.116338 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.414054 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.414715 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="91997a3e-9e65-4eab-a0b9-8f9c639a8d05" containerName="kube-state-metrics" containerID="cri-o://59ed5bb57f5c002905a336da46ce8019d8424d181ddbd01fd683c6c25bea9d90" gracePeriod=30 Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.619023 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.679029 4781 generic.go:334] "Generic (PLEG): container finished" podID="91997a3e-9e65-4eab-a0b9-8f9c639a8d05" containerID="59ed5bb57f5c002905a336da46ce8019d8424d181ddbd01fd683c6c25bea9d90" exitCode=2 Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.679105 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"91997a3e-9e65-4eab-a0b9-8f9c639a8d05","Type":"ContainerDied","Data":"59ed5bb57f5c002905a336da46ce8019d8424d181ddbd01fd683c6c25bea9d90"} Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.685162 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerStarted","Data":"2c4686839ed88b8a45c07bbc45e5d7e8f95577bd88f8f5ed02b133c4326a106e"} Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.692951 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49thr" event={"ID":"cc2a000c-f169-4622-8c82-cd4c2baa730a","Type":"ContainerStarted","Data":"d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344"} Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.720278 4781 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-operators-49thr" podStartSLOduration=3.212644276 podStartE2EDuration="10.720259104s" podCreationTimestamp="2026-02-27 00:29:48 +0000 UTC" firstStartedPulling="2026-02-27 00:29:50.573078578 +0000 UTC m=+1459.830618122" lastFinishedPulling="2026-02-27 00:29:58.080693386 +0000 UTC m=+1467.338232950" observedRunningTime="2026-02-27 00:29:58.709088422 +0000 UTC m=+1467.966627976" watchObservedRunningTime="2026-02-27 00:29:58.720259104 +0000 UTC m=+1467.977798658" Feb 27 00:29:58 crc kubenswrapper[4781]: W0227 00:29:58.733304 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ecbcf96_0260_4e87_afe5_9acc6098ec59.slice/crio-b6418bc15cd13e8d0bd6da20db363f8f00d774a0d59f04ab4104216089ac4cbd WatchSource:0}: Error finding container b6418bc15cd13e8d0bd6da20db363f8f00d774a0d59f04ab4104216089ac4cbd: Status 404 returned error can't find the container with id b6418bc15cd13e8d0bd6da20db363f8f00d774a0d59f04ab4104216089ac4cbd Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.738373 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.911448 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.941485 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.013954 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.013995 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 
00:29:59.030521 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.048679 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr9c5\" (UniqueName: \"kubernetes.io/projected/91997a3e-9e65-4eab-a0b9-8f9c639a8d05-kube-api-access-hr9c5\") pod \"91997a3e-9e65-4eab-a0b9-8f9c639a8d05\" (UID: \"91997a3e-9e65-4eab-a0b9-8f9c639a8d05\") " Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.053957 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91997a3e-9e65-4eab-a0b9-8f9c639a8d05-kube-api-access-hr9c5" (OuterVolumeSpecName: "kube-api-access-hr9c5") pod "91997a3e-9e65-4eab-a0b9-8f9c639a8d05" (UID: "91997a3e-9e65-4eab-a0b9-8f9c639a8d05"). InnerVolumeSpecName "kube-api-access-hr9c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.153055 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr9c5\" (UniqueName: \"kubernetes.io/projected/91997a3e-9e65-4eab-a0b9-8f9c639a8d05-kube-api-access-hr9c5\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.319877 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" path="/var/lib/kubelet/pods/7825d67f-c124-4ee9-9e74-32c35c4370c0/volumes" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.320610 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" path="/var/lib/kubelet/pods/e57ffac0-932b-42fd-bc09-ae357b25eeb1/volumes" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.736374 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"0ecbcf96-0260-4e87-afe5-9acc6098ec59","Type":"ContainerStarted","Data":"4fb1191f2a33c534b5ff081b75c12582b7d083d0914703fd100fe77a443ffaca"} Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.736931 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ecbcf96-0260-4e87-afe5-9acc6098ec59","Type":"ContainerStarted","Data":"a1f456672f54264ca1c600a8f933a178f9e17b564318587aad61468738ce8718"} Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.736949 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ecbcf96-0260-4e87-afe5-9acc6098ec59","Type":"ContainerStarted","Data":"b6418bc15cd13e8d0bd6da20db363f8f00d774a0d59f04ab4104216089ac4cbd"} Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.743027 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"91997a3e-9e65-4eab-a0b9-8f9c639a8d05","Type":"ContainerDied","Data":"40455368adae6ae08a72863a733dea4cab1b575394a61b8c7f6b3e11518f1446"} Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.743080 4781 scope.go:117] "RemoveContainer" containerID="59ed5bb57f5c002905a336da46ce8019d8424d181ddbd01fd683c6c25bea9d90" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.743231 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.747052 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerStarted","Data":"6aee62b4a0e99b93b70f6483a48731232b6e0bac4ac727deafa8d7b77d1fa9e5"} Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.773918 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.773900059 podStartE2EDuration="2.773900059s" podCreationTimestamp="2026-02-27 00:29:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:59.759941952 +0000 UTC m=+1469.017481516" watchObservedRunningTime="2026-02-27 00:29:59.773900059 +0000 UTC m=+1469.031439613" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.781661 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.910175 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.949425 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.960256 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 00:29:59 crc kubenswrapper[4781]: E0227 00:29:59.962671 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91997a3e-9e65-4eab-a0b9-8f9c639a8d05" containerName="kube-state-metrics" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.962718 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="91997a3e-9e65-4eab-a0b9-8f9c639a8d05" containerName="kube-state-metrics" Feb 27 00:29:59 crc kubenswrapper[4781]: 
I0227 00:29:59.963964 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="91997a3e-9e65-4eab-a0b9-8f9c639a8d05" containerName="kube-state-metrics" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.968232 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.971014 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.971394 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.973099 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.045959 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-6twxl"] Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.048200 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.051748 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.058935 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.060182 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6twxl"] Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.065525 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-49thr" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="registry-server" probeResult="failure" output=< Feb 27 00:30:00 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 00:30:00 crc kubenswrapper[4781]: > Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.083384 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.083500 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bp4c\" (UniqueName: \"kubernetes.io/projected/25933928-b136-4b38-955a-46a3d802a62b-kube-api-access-9bp4c\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.083527 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.083648 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.096817 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.153078 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535870-pjv2c"] Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.155028 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535870-pjv2c" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.157339 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.157655 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.162134 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.185146 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-config-data\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.185221 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.185283 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.185312 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-scripts\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.185369 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpxdl\" (UniqueName: \"kubernetes.io/projected/27b0d2a5-5629-42a0-8884-a5534240b356-kube-api-access-xpxdl\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.185433 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bp4c\" (UniqueName: \"kubernetes.io/projected/25933928-b136-4b38-955a-46a3d802a62b-kube-api-access-9bp4c\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.185466 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.185588 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.194577 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.194763 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"] Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.196398 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.196672 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.198266 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.199873 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.200090 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.218179 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535870-pjv2c"] Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 
00:30:00.220795 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bp4c\" (UniqueName: \"kubernetes.io/projected/25933928-b136-4b38-955a-46a3d802a62b-kube-api-access-9bp4c\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.233976 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"] Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.251292 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-l4cw7"] Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.251585 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" podUID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerName="dnsmasq-dns" containerID="cri-o://26d79208d95dcfd480e6dcf5e635ea74d70976218b9d0db2771a4aca513d9249" gracePeriod=10 Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.288028 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-config-data\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.288244 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.288459 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-scripts\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.288528 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqmlt\" (UniqueName: \"kubernetes.io/projected/b8ec74af-d604-42ac-83bb-db047e8d8506-kube-api-access-jqmlt\") pod \"auto-csr-approver-29535870-pjv2c\" (UID: \"b8ec74af-d604-42ac-83bb-db047e8d8506\") " pod="openshift-infra/auto-csr-approver-29535870-pjv2c" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.288647 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpxdl\" (UniqueName: \"kubernetes.io/projected/27b0d2a5-5629-42a0-8884-a5534240b356-kube-api-access-xpxdl\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.299035 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-scripts\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.299237 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.299549 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.299734 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-config-data\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.322580 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpxdl\" (UniqueName: \"kubernetes.io/projected/27b0d2a5-5629-42a0-8884-a5534240b356-kube-api-access-xpxdl\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.367785 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.390977 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb78ed91-75d4-40d9-9359-da1c3878e145-config-volume\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.391362 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4rc4\" (UniqueName: \"kubernetes.io/projected/eb78ed91-75d4-40d9-9359-da1c3878e145-kube-api-access-l4rc4\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.391399 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb78ed91-75d4-40d9-9359-da1c3878e145-secret-volume\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.391435 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqmlt\" (UniqueName: \"kubernetes.io/projected/b8ec74af-d604-42ac-83bb-db047e8d8506-kube-api-access-jqmlt\") pod \"auto-csr-approver-29535870-pjv2c\" (UID: \"b8ec74af-d604-42ac-83bb-db047e8d8506\") " pod="openshift-infra/auto-csr-approver-29535870-pjv2c" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.415308 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqmlt\" (UniqueName: \"kubernetes.io/projected/b8ec74af-d604-42ac-83bb-db047e8d8506-kube-api-access-jqmlt\") pod \"auto-csr-approver-29535870-pjv2c\" (UID: \"b8ec74af-d604-42ac-83bb-db047e8d8506\") " pod="openshift-infra/auto-csr-approver-29535870-pjv2c" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.493342 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535870-pjv2c" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.495638 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb78ed91-75d4-40d9-9359-da1c3878e145-config-volume\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.495752 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4rc4\" (UniqueName: \"kubernetes.io/projected/eb78ed91-75d4-40d9-9359-da1c3878e145-kube-api-access-l4rc4\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.495798 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb78ed91-75d4-40d9-9359-da1c3878e145-secret-volume\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.498531 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb78ed91-75d4-40d9-9359-da1c3878e145-config-volume\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.500573 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/eb78ed91-75d4-40d9-9359-da1c3878e145-secret-volume\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.525380 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4rc4\" (UniqueName: \"kubernetes.io/projected/eb78ed91-75d4-40d9-9359-da1c3878e145-kube-api-access-l4rc4\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.643600 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.786913 4781 generic.go:334] "Generic (PLEG): container finished" podID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerID="26d79208d95dcfd480e6dcf5e635ea74d70976218b9d0db2771a4aca513d9249" exitCode=0 Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.787193 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" event={"ID":"5f47f2d5-f4d5-448d-9355-ebe37959b584","Type":"ContainerDied","Data":"26d79208d95dcfd480e6dcf5e635ea74d70976218b9d0db2771a4aca513d9249"} Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.825818 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerStarted","Data":"7b717e2de43421aa8594d897ad2625a4ab8b03819b1fddaf68c17d412b495f36"} Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.944075 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.031558 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-nb\") pod \"5f47f2d5-f4d5-448d-9355-ebe37959b584\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.031763 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-svc\") pod \"5f47f2d5-f4d5-448d-9355-ebe37959b584\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.031790 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-swift-storage-0\") pod \"5f47f2d5-f4d5-448d-9355-ebe37959b584\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.031877 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-config\") pod \"5f47f2d5-f4d5-448d-9355-ebe37959b584\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.031975 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-sb\") pod \"5f47f2d5-f4d5-448d-9355-ebe37959b584\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.032027 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9vk6\" 
(UniqueName: \"kubernetes.io/projected/5f47f2d5-f4d5-448d-9355-ebe37959b584-kube-api-access-f9vk6\") pod \"5f47f2d5-f4d5-448d-9355-ebe37959b584\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.049297 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f47f2d5-f4d5-448d-9355-ebe37959b584-kube-api-access-f9vk6" (OuterVolumeSpecName: "kube-api-access-f9vk6") pod "5f47f2d5-f4d5-448d-9355-ebe37959b584" (UID: "5f47f2d5-f4d5-448d-9355-ebe37959b584"). InnerVolumeSpecName "kube-api-access-f9vk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.128868 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5f47f2d5-f4d5-448d-9355-ebe37959b584" (UID: "5f47f2d5-f4d5-448d-9355-ebe37959b584"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.129412 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5f47f2d5-f4d5-448d-9355-ebe37959b584" (UID: "5f47f2d5-f4d5-448d-9355-ebe37959b584"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.135571 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.135606 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.135618 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9vk6\" (UniqueName: \"kubernetes.io/projected/5f47f2d5-f4d5-448d-9355-ebe37959b584-kube-api-access-f9vk6\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.140170 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5f47f2d5-f4d5-448d-9355-ebe37959b584" (UID: "5f47f2d5-f4d5-448d-9355-ebe37959b584"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.141563 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5f47f2d5-f4d5-448d-9355-ebe37959b584" (UID: "5f47f2d5-f4d5-448d-9355-ebe37959b584"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.176453 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-config" (OuterVolumeSpecName: "config") pod "5f47f2d5-f4d5-448d-9355-ebe37959b584" (UID: "5f47f2d5-f4d5-448d-9355-ebe37959b584"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.200319 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.237546 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.237584 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.237606 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.404855 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91997a3e-9e65-4eab-a0b9-8f9c639a8d05" path="/var/lib/kubelet/pods/91997a3e-9e65-4eab-a0b9-8f9c639a8d05/volumes" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.437761 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6twxl"] Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.449719 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 00:30:01 crc 
kubenswrapper[4781]: I0227 00:30:01.468698 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535870-pjv2c"] Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.724913 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"] Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.836009 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6twxl" event={"ID":"27b0d2a5-5629-42a0-8884-a5534240b356","Type":"ContainerStarted","Data":"603be41f44dabcefd367f03b819f0e12526431539cc454d1e0a0fbbe4c354d4e"} Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.836051 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6twxl" event={"ID":"27b0d2a5-5629-42a0-8884-a5534240b356","Type":"ContainerStarted","Data":"2e4718337e97959ef32ed9d78c1825b06db8a7a61e70b8f1c8473596ad38ebed"} Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.841838 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerStarted","Data":"fe5f026af9d91c7231233bd71a19d73aa9a872a6aadf224c6825ac237ba54fe7"} Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.850976 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" event={"ID":"5f47f2d5-f4d5-448d-9355-ebe37959b584","Type":"ContainerDied","Data":"e0e61b6d097a768cedf938a2051e02fe6b26d59774f1dfea50ad4f92d0779d0a"} Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.851031 4781 scope.go:117] "RemoveContainer" containerID="26d79208d95dcfd480e6dcf5e635ea74d70976218b9d0db2771a4aca513d9249" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.851186 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.854681 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-6twxl" podStartSLOduration=1.854669254 podStartE2EDuration="1.854669254s" podCreationTimestamp="2026-02-27 00:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:30:01.853594105 +0000 UTC m=+1471.111133649" watchObservedRunningTime="2026-02-27 00:30:01.854669254 +0000 UTC m=+1471.112208808" Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.856121 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535870-pjv2c" event={"ID":"b8ec74af-d604-42ac-83bb-db047e8d8506","Type":"ContainerStarted","Data":"ac4590128cbde68a4a47c1669a269597aaf13fb7275d8a433b383eae02651ba0"} Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.859645 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"25933928-b136-4b38-955a-46a3d802a62b","Type":"ContainerStarted","Data":"7b47bd884aeaea9a9ce08ffea52c5f963733140e3aa935098889fe9feffbc5ef"} Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.863042 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" event={"ID":"eb78ed91-75d4-40d9-9359-da1c3878e145","Type":"ContainerStarted","Data":"5dd01220d04e81eb4ff121788a86bb3729ff13c080727f1bce4d0bfcd72babda"} Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.893570 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-l4cw7"] Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.928043 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-l4cw7"] Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 
00:30:01.933190 4781 scope.go:117] "RemoveContainer" containerID="363437972dc1edd0a85fa61204497c017a7b8e034221df5e68a301f8138ef7f7"
Feb 27 00:30:02 crc kubenswrapper[4781]: I0227 00:30:02.874213 4781 generic.go:334] "Generic (PLEG): container finished" podID="eb78ed91-75d4-40d9-9359-da1c3878e145" containerID="d91a97b2a127dcb363e0a68bf8507e044d643d2c3b09f879675dfcd44d75afab" exitCode=0
Feb 27 00:30:02 crc kubenswrapper[4781]: I0227 00:30:02.874314 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" event={"ID":"eb78ed91-75d4-40d9-9359-da1c3878e145","Type":"ContainerDied","Data":"d91a97b2a127dcb363e0a68bf8507e044d643d2c3b09f879675dfcd44d75afab"}
Feb 27 00:30:02 crc kubenswrapper[4781]: I0227 00:30:02.878082 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"25933928-b136-4b38-955a-46a3d802a62b","Type":"ContainerStarted","Data":"1dc221fdcf5400feadbccc6ac0f50f82af7a44c7d54118a608923e78070e2715"}
Feb 27 00:30:02 crc kubenswrapper[4781]: I0227 00:30:02.878302 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 27 00:30:02 crc kubenswrapper[4781]: I0227 00:30:02.913480 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.37756933 podStartE2EDuration="3.913460488s" podCreationTimestamp="2026-02-27 00:29:59 +0000 UTC" firstStartedPulling="2026-02-27 00:30:01.395433647 +0000 UTC m=+1470.652973201" lastFinishedPulling="2026-02-27 00:30:01.931324805 +0000 UTC m=+1471.188864359" observedRunningTime="2026-02-27 00:30:02.903852498 +0000 UTC m=+1472.161392082" watchObservedRunningTime="2026-02-27 00:30:02.913460488 +0000 UTC m=+1472.171000042"
Feb 27 00:30:03 crc kubenswrapper[4781]: I0227 00:30:03.344614 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f47f2d5-f4d5-448d-9355-ebe37959b584" path="/var/lib/kubelet/pods/5f47f2d5-f4d5-448d-9355-ebe37959b584/volumes"
Feb 27 00:30:03 crc kubenswrapper[4781]: I0227 00:30:03.897924 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerStarted","Data":"12384cd6d81bca6e23791de7f51204e92fef8abc9d7b3fd6d25b23d6563fe9c1"}
Feb 27 00:30:03 crc kubenswrapper[4781]: I0227 00:30:03.897919 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="ceilometer-central-agent" containerID="cri-o://6aee62b4a0e99b93b70f6483a48731232b6e0bac4ac727deafa8d7b77d1fa9e5" gracePeriod=30
Feb 27 00:30:03 crc kubenswrapper[4781]: I0227 00:30:03.897966 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="proxy-httpd" containerID="cri-o://12384cd6d81bca6e23791de7f51204e92fef8abc9d7b3fd6d25b23d6563fe9c1" gracePeriod=30
Feb 27 00:30:03 crc kubenswrapper[4781]: I0227 00:30:03.897976 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="sg-core" containerID="cri-o://fe5f026af9d91c7231233bd71a19d73aa9a872a6aadf224c6825ac237ba54fe7" gracePeriod=30
Feb 27 00:30:03 crc kubenswrapper[4781]: I0227 00:30:03.898430 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 27 00:30:03 crc kubenswrapper[4781]: I0227 00:30:03.897986 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="ceilometer-notification-agent" containerID="cri-o://7b717e2de43421aa8594d897ad2625a4ab8b03819b1fddaf68c17d412b495f36" gracePeriod=30
Feb 27 00:30:03 crc kubenswrapper[4781]: I0227 00:30:03.934045 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.335436315 podStartE2EDuration="6.93402468s" podCreationTimestamp="2026-02-27 00:29:57 +0000 UTC" firstStartedPulling="2026-02-27 00:29:58.627340134 +0000 UTC m=+1467.884879688" lastFinishedPulling="2026-02-27 00:30:03.225928499 +0000 UTC m=+1472.483468053" observedRunningTime="2026-02-27 00:30:03.927783811 +0000 UTC m=+1473.185323365" watchObservedRunningTime="2026-02-27 00:30:03.93402468 +0000 UTC m=+1473.191564234"
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.496681 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.584374 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb78ed91-75d4-40d9-9359-da1c3878e145-secret-volume\") pod \"eb78ed91-75d4-40d9-9359-da1c3878e145\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") "
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.584707 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4rc4\" (UniqueName: \"kubernetes.io/projected/eb78ed91-75d4-40d9-9359-da1c3878e145-kube-api-access-l4rc4\") pod \"eb78ed91-75d4-40d9-9359-da1c3878e145\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") "
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.584772 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb78ed91-75d4-40d9-9359-da1c3878e145-config-volume\") pod \"eb78ed91-75d4-40d9-9359-da1c3878e145\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") "
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.585426 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb78ed91-75d4-40d9-9359-da1c3878e145-config-volume" (OuterVolumeSpecName: "config-volume") pod "eb78ed91-75d4-40d9-9359-da1c3878e145" (UID: "eb78ed91-75d4-40d9-9359-da1c3878e145"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.590712 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb78ed91-75d4-40d9-9359-da1c3878e145-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eb78ed91-75d4-40d9-9359-da1c3878e145" (UID: "eb78ed91-75d4-40d9-9359-da1c3878e145"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.606903 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb78ed91-75d4-40d9-9359-da1c3878e145-kube-api-access-l4rc4" (OuterVolumeSpecName: "kube-api-access-l4rc4") pod "eb78ed91-75d4-40d9-9359-da1c3878e145" (UID: "eb78ed91-75d4-40d9-9359-da1c3878e145"). InnerVolumeSpecName "kube-api-access-l4rc4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.687025 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4rc4\" (UniqueName: \"kubernetes.io/projected/eb78ed91-75d4-40d9-9359-da1c3878e145-kube-api-access-l4rc4\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.687230 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb78ed91-75d4-40d9-9359-da1c3878e145-config-volume\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.687328 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb78ed91-75d4-40d9-9359-da1c3878e145-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.908981 4781 generic.go:334] "Generic (PLEG): container finished" podID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerID="fe5f026af9d91c7231233bd71a19d73aa9a872a6aadf224c6825ac237ba54fe7" exitCode=2
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.909317 4781 generic.go:334] "Generic (PLEG): container finished" podID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerID="7b717e2de43421aa8594d897ad2625a4ab8b03819b1fddaf68c17d412b495f36" exitCode=0
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.909050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerDied","Data":"fe5f026af9d91c7231233bd71a19d73aa9a872a6aadf224c6825ac237ba54fe7"}
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.909379 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerDied","Data":"7b717e2de43421aa8594d897ad2625a4ab8b03819b1fddaf68c17d412b495f36"}
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.910990 4781 generic.go:334] "Generic (PLEG): container finished" podID="b8ec74af-d604-42ac-83bb-db047e8d8506" containerID="2520db6bdce6e0291f097369119b25f716226e74f321fc28345a81a9140017c8" exitCode=0
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.911065 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535870-pjv2c" event={"ID":"b8ec74af-d604-42ac-83bb-db047e8d8506","Type":"ContainerDied","Data":"2520db6bdce6e0291f097369119b25f716226e74f321fc28345a81a9140017c8"}
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.912408 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" event={"ID":"eb78ed91-75d4-40d9-9359-da1c3878e145","Type":"ContainerDied","Data":"5dd01220d04e81eb4ff121788a86bb3729ff13c080727f1bce4d0bfcd72babda"}
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.912430 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dd01220d04e81eb4ff121788a86bb3729ff13c080727f1bce4d0bfcd72babda"
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.912484 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"
Feb 27 00:30:05 crc kubenswrapper[4781]: I0227 00:30:05.618066 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" podUID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.221:5353: i/o timeout"
Feb 27 00:30:06 crc kubenswrapper[4781]: I0227 00:30:06.391363 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535870-pjv2c"
Feb 27 00:30:06 crc kubenswrapper[4781]: I0227 00:30:06.425231 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqmlt\" (UniqueName: \"kubernetes.io/projected/b8ec74af-d604-42ac-83bb-db047e8d8506-kube-api-access-jqmlt\") pod \"b8ec74af-d604-42ac-83bb-db047e8d8506\" (UID: \"b8ec74af-d604-42ac-83bb-db047e8d8506\") "
Feb 27 00:30:06 crc kubenswrapper[4781]: I0227 00:30:06.434408 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ec74af-d604-42ac-83bb-db047e8d8506-kube-api-access-jqmlt" (OuterVolumeSpecName: "kube-api-access-jqmlt") pod "b8ec74af-d604-42ac-83bb-db047e8d8506" (UID: "b8ec74af-d604-42ac-83bb-db047e8d8506"). InnerVolumeSpecName "kube-api-access-jqmlt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:30:06 crc kubenswrapper[4781]: I0227 00:30:06.530558 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqmlt\" (UniqueName: \"kubernetes.io/projected/b8ec74af-d604-42ac-83bb-db047e8d8506-kube-api-access-jqmlt\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:06 crc kubenswrapper[4781]: I0227 00:30:06.933038 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535870-pjv2c" event={"ID":"b8ec74af-d604-42ac-83bb-db047e8d8506","Type":"ContainerDied","Data":"ac4590128cbde68a4a47c1669a269597aaf13fb7275d8a433b383eae02651ba0"}
Feb 27 00:30:06 crc kubenswrapper[4781]: I0227 00:30:06.933452 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac4590128cbde68a4a47c1669a269597aaf13fb7275d8a433b383eae02651ba0"
Feb 27 00:30:06 crc kubenswrapper[4781]: I0227 00:30:06.933122 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535870-pjv2c"
Feb 27 00:30:07 crc kubenswrapper[4781]: I0227 00:30:07.480586 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535864-cfd4d"]
Feb 27 00:30:07 crc kubenswrapper[4781]: I0227 00:30:07.495786 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535864-cfd4d"]
Feb 27 00:30:07 crc kubenswrapper[4781]: I0227 00:30:07.943974 4781 generic.go:334] "Generic (PLEG): container finished" podID="27b0d2a5-5629-42a0-8884-a5534240b356" containerID="603be41f44dabcefd367f03b819f0e12526431539cc454d1e0a0fbbe4c354d4e" exitCode=0
Feb 27 00:30:07 crc kubenswrapper[4781]: I0227 00:30:07.944069 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6twxl" event={"ID":"27b0d2a5-5629-42a0-8884-a5534240b356","Type":"ContainerDied","Data":"603be41f44dabcefd367f03b819f0e12526431539cc454d1e0a0fbbe4c354d4e"}
Feb 27 00:30:07 crc kubenswrapper[4781]: I0227 00:30:07.946949 4781 generic.go:334] "Generic (PLEG): container finished" podID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerID="6aee62b4a0e99b93b70f6483a48731232b6e0bac4ac727deafa8d7b77d1fa9e5" exitCode=0
Feb 27 00:30:07 crc kubenswrapper[4781]: I0227 00:30:07.946991 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerDied","Data":"6aee62b4a0e99b93b70f6483a48731232b6e0bac4ac727deafa8d7b77d1fa9e5"}
Feb 27 00:30:08 crc kubenswrapper[4781]: I0227 00:30:08.116677 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 27 00:30:08 crc kubenswrapper[4781]: I0227 00:30:08.116726 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.133812 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.233:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.133825 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.233:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.326531 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9402a6e-66bb-4e1e-a33f-7fce411c83b8" path="/var/lib/kubelet/pods/b9402a6e-66bb-4e1e-a33f-7fce411c83b8/volumes"
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.408288 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6twxl"
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.485812 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-combined-ca-bundle\") pod \"27b0d2a5-5629-42a0-8884-a5534240b356\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") "
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.485977 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-scripts\") pod \"27b0d2a5-5629-42a0-8884-a5534240b356\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") "
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.486015 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpxdl\" (UniqueName: \"kubernetes.io/projected/27b0d2a5-5629-42a0-8884-a5534240b356-kube-api-access-xpxdl\") pod \"27b0d2a5-5629-42a0-8884-a5534240b356\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") "
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.486049 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-config-data\") pod \"27b0d2a5-5629-42a0-8884-a5534240b356\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") "
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.503788 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-scripts" (OuterVolumeSpecName: "scripts") pod "27b0d2a5-5629-42a0-8884-a5534240b356" (UID: "27b0d2a5-5629-42a0-8884-a5534240b356"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.503844 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27b0d2a5-5629-42a0-8884-a5534240b356-kube-api-access-xpxdl" (OuterVolumeSpecName: "kube-api-access-xpxdl") pod "27b0d2a5-5629-42a0-8884-a5534240b356" (UID: "27b0d2a5-5629-42a0-8884-a5534240b356"). InnerVolumeSpecName "kube-api-access-xpxdl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.519905 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-config-data" (OuterVolumeSpecName: "config-data") pod "27b0d2a5-5629-42a0-8884-a5534240b356" (UID: "27b0d2a5-5629-42a0-8884-a5534240b356"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.519996 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27b0d2a5-5629-42a0-8884-a5534240b356" (UID: "27b0d2a5-5629-42a0-8884-a5534240b356"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.588868 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.588906 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.588918 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpxdl\" (UniqueName: \"kubernetes.io/projected/27b0d2a5-5629-42a0-8884-a5534240b356-kube-api-access-xpxdl\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.588931 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.972011 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6twxl" event={"ID":"27b0d2a5-5629-42a0-8884-a5534240b356","Type":"ContainerDied","Data":"2e4718337e97959ef32ed9d78c1825b06db8a7a61e70b8f1c8473596ad38ebed"}
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.972054 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e4718337e97959ef32ed9d78c1825b06db8a7a61e70b8f1c8473596ad38ebed"
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.972070 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6twxl"
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.062412 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-49thr" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="registry-server" probeResult="failure" output=<
Feb 27 00:30:10 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s
Feb 27 00:30:10 crc kubenswrapper[4781]: >
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.163530 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.163767 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-log" containerID="cri-o://a1f456672f54264ca1c600a8f933a178f9e17b564318587aad61468738ce8718" gracePeriod=30
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.163883 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-api" containerID="cri-o://4fb1191f2a33c534b5ff081b75c12582b7d083d0914703fd100fe77a443ffaca" gracePeriod=30
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.185382 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.185715 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f5e01f6b-d306-41ac-9988-156063c5af7d" containerName="nova-scheduler-scheduler" containerID="cri-o://8a434297e1d497ddfa074c1233744a9c79e7a3482bb8e37e36657a3849467eab" gracePeriod=30
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.220939 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.221795 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-metadata" containerID="cri-o://8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf" gracePeriod=30
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.225805 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-log" containerID="cri-o://a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9" gracePeriod=30
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.332468 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 27 00:30:10 crc kubenswrapper[4781]: E0227 00:30:10.982668 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a434297e1d497ddfa074c1233744a9c79e7a3482bb8e37e36657a3849467eab" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 27 00:30:10 crc kubenswrapper[4781]: E0227 00:30:10.988715 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a434297e1d497ddfa074c1233744a9c79e7a3482bb8e37e36657a3849467eab" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.992147 4781 generic.go:334] "Generic (PLEG): container finished" podID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerID="a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9" exitCode=143
Feb 27 00:30:10 crc kubenswrapper[4781]: E0227 00:30:10.992160 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a434297e1d497ddfa074c1233744a9c79e7a3482bb8e37e36657a3849467eab" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 27 00:30:10 crc kubenswrapper[4781]: E0227 00:30:10.992217 4781 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f5e01f6b-d306-41ac-9988-156063c5af7d" containerName="nova-scheduler-scheduler"
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.992227 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7524846b-772f-47a1-aaae-e7f29db2c0b5","Type":"ContainerDied","Data":"a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9"}
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.996818 4781 generic.go:334] "Generic (PLEG): container finished" podID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerID="a1f456672f54264ca1c600a8f933a178f9e17b564318587aad61468738ce8718" exitCode=143
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.996861 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ecbcf96-0260-4e87-afe5-9acc6098ec59","Type":"ContainerDied","Data":"a1f456672f54264ca1c600a8f933a178f9e17b564318587aad61468738ce8718"}
Feb 27 00:30:12 crc kubenswrapper[4781]: I0227 00:30:12.895471 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 00:30:12 crc kubenswrapper[4781]: I0227 00:30:12.895875 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.378869 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": read tcp 10.217.0.2:48616->10.217.0.225:8775: read: connection reset by peer"
Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.378958 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": read tcp 10.217.0.2:48614->10.217.0.225:8775: read: connection reset by peer"
Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.906137 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.978777 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-combined-ca-bundle\") pod \"7524846b-772f-47a1-aaae-e7f29db2c0b5\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") "
Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.978903 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-nova-metadata-tls-certs\") pod \"7524846b-772f-47a1-aaae-e7f29db2c0b5\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") "
Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.978966 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-config-data\") pod \"7524846b-772f-47a1-aaae-e7f29db2c0b5\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") "
Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.979043 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94rnf\" (UniqueName: \"kubernetes.io/projected/7524846b-772f-47a1-aaae-e7f29db2c0b5-kube-api-access-94rnf\") pod \"7524846b-772f-47a1-aaae-e7f29db2c0b5\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") "
Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.979182 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7524846b-772f-47a1-aaae-e7f29db2c0b5-logs\") pod \"7524846b-772f-47a1-aaae-e7f29db2c0b5\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") "
Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.980263 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7524846b-772f-47a1-aaae-e7f29db2c0b5-logs" (OuterVolumeSpecName: "logs") pod "7524846b-772f-47a1-aaae-e7f29db2c0b5" (UID: "7524846b-772f-47a1-aaae-e7f29db2c0b5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.988849 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7524846b-772f-47a1-aaae-e7f29db2c0b5-kube-api-access-94rnf" (OuterVolumeSpecName: "kube-api-access-94rnf") pod "7524846b-772f-47a1-aaae-e7f29db2c0b5" (UID: "7524846b-772f-47a1-aaae-e7f29db2c0b5"). InnerVolumeSpecName "kube-api-access-94rnf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.014543 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7524846b-772f-47a1-aaae-e7f29db2c0b5" (UID: "7524846b-772f-47a1-aaae-e7f29db2c0b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.019389 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-config-data" (OuterVolumeSpecName: "config-data") pod "7524846b-772f-47a1-aaae-e7f29db2c0b5" (UID: "7524846b-772f-47a1-aaae-e7f29db2c0b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.037689 4781 generic.go:334] "Generic (PLEG): container finished" podID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerID="8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf" exitCode=0
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.037728 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7524846b-772f-47a1-aaae-e7f29db2c0b5","Type":"ContainerDied","Data":"8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf"}
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.037754 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7524846b-772f-47a1-aaae-e7f29db2c0b5","Type":"ContainerDied","Data":"38c766ee39e3c7f63ba0e025dfac0a7e85784b006de99cf2ffb71128d62e3b91"}
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.037773 4781 scope.go:117] "RemoveContainer" containerID="8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf"
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.037912 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.044499 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7524846b-772f-47a1-aaae-e7f29db2c0b5" (UID: "7524846b-772f-47a1-aaae-e7f29db2c0b5"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.116811 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7524846b-772f-47a1-aaae-e7f29db2c0b5-logs\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.116845 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.116855 4781 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.116864 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.116872 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94rnf\" (UniqueName: \"kubernetes.io/projected/7524846b-772f-47a1-aaae-e7f29db2c0b5-kube-api-access-94rnf\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.139973 4781 scope.go:117] "RemoveContainer" containerID="a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9"
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.172666 4781 scope.go:117] "RemoveContainer" containerID="8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf"
Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.173091 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf\": container with ID starting with 8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf not found: ID does not exist" containerID="8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf"
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.173120 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf"} err="failed to get container status \"8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf\": rpc error: code = NotFound desc = could not find container \"8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf\": container with ID starting with 8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf not found: ID does not exist"
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.173142 4781 scope.go:117] "RemoveContainer" containerID="a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9"
Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.173444 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9\": container with ID starting with a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9 not found: ID does not exist" containerID="a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9"
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.173462 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9"} err="failed to get container status \"a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9\": rpc error: code = NotFound desc = could not find container \"a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9\": container with ID starting with a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9 not found: ID does not exist"
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.369642 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.380279 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.393024 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.393756 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerName="init"
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.393846 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerName="init"
Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.393938 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-metadata"
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.394044 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-metadata"
Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.394153 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b0d2a5-5629-42a0-8884-a5534240b356" containerName="nova-manage"
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.394238 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b0d2a5-5629-42a0-8884-a5534240b356" containerName="nova-manage"
Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.394312 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb78ed91-75d4-40d9-9359-da1c3878e145" containerName="collect-profiles"
Feb 27 00:30:14 crc kubenswrapper[4781]: I0227
00:30:14.394377 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb78ed91-75d4-40d9-9359-da1c3878e145" containerName="collect-profiles" Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.394459 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-log" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.394549 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-log" Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.394666 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerName="dnsmasq-dns" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.394748 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerName="dnsmasq-dns" Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.394822 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ec74af-d604-42ac-83bb-db047e8d8506" containerName="oc" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.394884 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ec74af-d604-42ac-83bb-db047e8d8506" containerName="oc" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.395196 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerName="dnsmasq-dns" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.396765 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb78ed91-75d4-40d9-9359-da1c3878e145" containerName="collect-profiles" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.396851 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="27b0d2a5-5629-42a0-8884-a5534240b356" containerName="nova-manage" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.396865 4781 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b8ec74af-d604-42ac-83bb-db047e8d8506" containerName="oc" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.396883 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-log" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.396895 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-metadata" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.398368 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.402534 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.402918 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.419001 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.527113 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-config-data\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.527169 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 
00:30:14.527228 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e32c3573-4acb-4d70-aa6e-2d647c108931-logs\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.527287 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.527317 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmd4g\" (UniqueName: \"kubernetes.io/projected/e32c3573-4acb-4d70-aa6e-2d647c108931-kube-api-access-gmd4g\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.629540 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-config-data\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.629617 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.629728 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e32c3573-4acb-4d70-aa6e-2d647c108931-logs\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.629817 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.629855 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmd4g\" (UniqueName: \"kubernetes.io/projected/e32c3573-4acb-4d70-aa6e-2d647c108931-kube-api-access-gmd4g\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.630298 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e32c3573-4acb-4d70-aa6e-2d647c108931-logs\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.637344 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.637499 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-config-data\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 
crc kubenswrapper[4781]: I0227 00:30:14.637539 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.656963 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmd4g\" (UniqueName: \"kubernetes.io/projected/e32c3573-4acb-4d70-aa6e-2d647c108931-kube-api-access-gmd4g\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.716264 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.048637 4781 generic.go:334] "Generic (PLEG): container finished" podID="f5e01f6b-d306-41ac-9988-156063c5af7d" containerID="8a434297e1d497ddfa074c1233744a9c79e7a3482bb8e37e36657a3849467eab" exitCode=0 Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.048999 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f5e01f6b-d306-41ac-9988-156063c5af7d","Type":"ContainerDied","Data":"8a434297e1d497ddfa074c1233744a9c79e7a3482bb8e37e36657a3849467eab"} Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.052439 4781 generic.go:334] "Generic (PLEG): container finished" podID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerID="4fb1191f2a33c534b5ff081b75c12582b7d083d0914703fd100fe77a443ffaca" exitCode=0 Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.052473 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ecbcf96-0260-4e87-afe5-9acc6098ec59","Type":"ContainerDied","Data":"4fb1191f2a33c534b5ff081b75c12582b7d083d0914703fd100fe77a443ffaca"} Feb 27 
00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.160710 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.241613 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-public-tls-certs\") pod \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.241774 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ecbcf96-0260-4e87-afe5-9acc6098ec59-logs\") pod \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.241828 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-config-data\") pod \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.241866 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-combined-ca-bundle\") pod \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.241944 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-internal-tls-certs\") pod \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.242015 4781 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmfqp\" (UniqueName: \"kubernetes.io/projected/0ecbcf96-0260-4e87-afe5-9acc6098ec59-kube-api-access-vmfqp\") pod \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.243409 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ecbcf96-0260-4e87-afe5-9acc6098ec59-logs" (OuterVolumeSpecName: "logs") pod "0ecbcf96-0260-4e87-afe5-9acc6098ec59" (UID: "0ecbcf96-0260-4e87-afe5-9acc6098ec59"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.246337 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ecbcf96-0260-4e87-afe5-9acc6098ec59-kube-api-access-vmfqp" (OuterVolumeSpecName: "kube-api-access-vmfqp") pod "0ecbcf96-0260-4e87-afe5-9acc6098ec59" (UID: "0ecbcf96-0260-4e87-afe5-9acc6098ec59"). InnerVolumeSpecName "kube-api-access-vmfqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.284415 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-config-data" (OuterVolumeSpecName: "config-data") pod "0ecbcf96-0260-4e87-afe5-9acc6098ec59" (UID: "0ecbcf96-0260-4e87-afe5-9acc6098ec59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.288519 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ecbcf96-0260-4e87-afe5-9acc6098ec59" (UID: "0ecbcf96-0260-4e87-afe5-9acc6098ec59"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.313763 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0ecbcf96-0260-4e87-afe5-9acc6098ec59" (UID: "0ecbcf96-0260-4e87-afe5-9acc6098ec59"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.328259 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0ecbcf96-0260-4e87-afe5-9acc6098ec59" (UID: "0ecbcf96-0260-4e87-afe5-9acc6098ec59"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.330477 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" path="/var/lib/kubelet/pods/7524846b-772f-47a1-aaae-e7f29db2c0b5/volumes" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.353557 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ecbcf96-0260-4e87-afe5-9acc6098ec59-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.353653 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.353666 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:15 crc 
kubenswrapper[4781]: I0227 00:30:15.353680 4781 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.353698 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmfqp\" (UniqueName: \"kubernetes.io/projected/0ecbcf96-0260-4e87-afe5-9acc6098ec59-kube-api-access-vmfqp\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.353707 4781 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.397649 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.411175 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.455304 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-combined-ca-bundle\") pod \"f5e01f6b-d306-41ac-9988-156063c5af7d\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.455405 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-config-data\") pod \"f5e01f6b-d306-41ac-9988-156063c5af7d\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.455695 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mrw78\" (UniqueName: \"kubernetes.io/projected/f5e01f6b-d306-41ac-9988-156063c5af7d-kube-api-access-mrw78\") pod \"f5e01f6b-d306-41ac-9988-156063c5af7d\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.461492 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5e01f6b-d306-41ac-9988-156063c5af7d-kube-api-access-mrw78" (OuterVolumeSpecName: "kube-api-access-mrw78") pod "f5e01f6b-d306-41ac-9988-156063c5af7d" (UID: "f5e01f6b-d306-41ac-9988-156063c5af7d"). InnerVolumeSpecName "kube-api-access-mrw78". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.499477 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-config-data" (OuterVolumeSpecName: "config-data") pod "f5e01f6b-d306-41ac-9988-156063c5af7d" (UID: "f5e01f6b-d306-41ac-9988-156063c5af7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.513858 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5e01f6b-d306-41ac-9988-156063c5af7d" (UID: "f5e01f6b-d306-41ac-9988-156063c5af7d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.560115 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrw78\" (UniqueName: \"kubernetes.io/projected/f5e01f6b-d306-41ac-9988-156063c5af7d-kube-api-access-mrw78\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.560168 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.560179 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.065036 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e32c3573-4acb-4d70-aa6e-2d647c108931","Type":"ContainerStarted","Data":"edef59ffff3d0873180602792810e3097a50cc4082ff788c05564c40ecb2297b"} Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.065436 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e32c3573-4acb-4d70-aa6e-2d647c108931","Type":"ContainerStarted","Data":"bc44e17aecd79eaa2526351b746c2930a0882074242ccf381e0b8d197a5ac152"} Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.065450 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e32c3573-4acb-4d70-aa6e-2d647c108931","Type":"ContainerStarted","Data":"016c4f8330bcc7bd961914e21614eb5a9ff7ba7a7613602e0f9713edabc22f78"} Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.067530 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"f5e01f6b-d306-41ac-9988-156063c5af7d","Type":"ContainerDied","Data":"9fd55676c924089cae058f2bc5bdb6090578b3476a36c5a733237f94e45c9618"} Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.067565 4781 scope.go:117] "RemoveContainer" containerID="8a434297e1d497ddfa074c1233744a9c79e7a3482bb8e37e36657a3849467eab" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.067621 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.070169 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ecbcf96-0260-4e87-afe5-9acc6098ec59","Type":"ContainerDied","Data":"b6418bc15cd13e8d0bd6da20db363f8f00d774a0d59f04ab4104216089ac4cbd"} Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.070247 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.109159 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.109135545 podStartE2EDuration="2.109135545s" podCreationTimestamp="2026-02-27 00:30:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:30:16.086869793 +0000 UTC m=+1485.344409367" watchObservedRunningTime="2026-02-27 00:30:16.109135545 +0000 UTC m=+1485.366675099" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.109235 4781 scope.go:117] "RemoveContainer" containerID="4fb1191f2a33c534b5ff081b75c12582b7d083d0914703fd100fe77a443ffaca" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.143707 4781 scope.go:117] "RemoveContainer" containerID="a1f456672f54264ca1c600a8f933a178f9e17b564318587aad61468738ce8718" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.144674 4781 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-api-0"] Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.175751 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.200221 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.216925 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.225984 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 27 00:30:16 crc kubenswrapper[4781]: E0227 00:30:16.226436 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-api" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.226454 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-api" Feb 27 00:30:16 crc kubenswrapper[4781]: E0227 00:30:16.226465 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5e01f6b-d306-41ac-9988-156063c5af7d" containerName="nova-scheduler-scheduler" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.226471 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5e01f6b-d306-41ac-9988-156063c5af7d" containerName="nova-scheduler-scheduler" Feb 27 00:30:16 crc kubenswrapper[4781]: E0227 00:30:16.226484 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-log" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.226492 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-log" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.226734 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-api"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.226759 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-log"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.226774 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5e01f6b-d306-41ac-9988-156063c5af7d" containerName="nova-scheduler-scheduler"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.227826 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.230233 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.230364 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.230747 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.273763 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.280754 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.290225 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.292312 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e258c11-5caa-4d6b-ab77-841ddf83ac81-logs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.292472 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.292753 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv52c\" (UniqueName: \"kubernetes.io/projected/4e258c11-5caa-4d6b-ab77-841ddf83ac81-kube-api-access-zv52c\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.292869 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-public-tls-certs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.293403 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-config-data\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.293766 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.296813 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.305302 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.395444 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-public-tls-certs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.395526 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqc6w\" (UniqueName: \"kubernetes.io/projected/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-kube-api-access-hqc6w\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.395708 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-config-data\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.395800 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-config-data\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.395956 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.396000 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.396058 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e258c11-5caa-4d6b-ab77-841ddf83ac81-logs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.396113 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.396148 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv52c\" (UniqueName: \"kubernetes.io/projected/4e258c11-5caa-4d6b-ab77-841ddf83ac81-kube-api-access-zv52c\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.397668 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e258c11-5caa-4d6b-ab77-841ddf83ac81-logs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.400529 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-public-tls-certs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.401104 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-config-data\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.401139 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.401226 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.422101 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv52c\" (UniqueName: \"kubernetes.io/projected/4e258c11-5caa-4d6b-ab77-841ddf83ac81-kube-api-access-zv52c\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.497994 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-config-data\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.498105 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.498174 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqc6w\" (UniqueName: \"kubernetes.io/projected/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-kube-api-access-hqc6w\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.501854 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-config-data\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.502614 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.514751 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqc6w\" (UniqueName: \"kubernetes.io/projected/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-kube-api-access-hqc6w\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.557383 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.623887 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 27 00:30:17 crc kubenswrapper[4781]: I0227 00:30:17.056471 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 27 00:30:17 crc kubenswrapper[4781]: W0227 00:30:17.059077 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e258c11_5caa_4d6b_ab77_841ddf83ac81.slice/crio-d68ab3f956ceb1f76180523b1da9ac1e6a71f2faf4fcadcc14a72c8376f2f86d WatchSource:0}: Error finding container d68ab3f956ceb1f76180523b1da9ac1e6a71f2faf4fcadcc14a72c8376f2f86d: Status 404 returned error can't find the container with id d68ab3f956ceb1f76180523b1da9ac1e6a71f2faf4fcadcc14a72c8376f2f86d
Feb 27 00:30:17 crc kubenswrapper[4781]: I0227 00:30:17.081032 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4e258c11-5caa-4d6b-ab77-841ddf83ac81","Type":"ContainerStarted","Data":"d68ab3f956ceb1f76180523b1da9ac1e6a71f2faf4fcadcc14a72c8376f2f86d"}
Feb 27 00:30:17 crc kubenswrapper[4781]: W0227 00:30:17.176470 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d7f8c00_d318_4f7d_b67e_6743c3a82dae.slice/crio-4aaddd97e770ae5123af55ed21c62a17a6d5ebf9815ca468e08d021529fa6778 WatchSource:0}: Error finding container 4aaddd97e770ae5123af55ed21c62a17a6d5ebf9815ca468e08d021529fa6778: Status 404 returned error can't find the container with id 4aaddd97e770ae5123af55ed21c62a17a6d5ebf9815ca468e08d021529fa6778
Feb 27 00:30:17 crc kubenswrapper[4781]: I0227 00:30:17.180907 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 27 00:30:17 crc kubenswrapper[4781]: I0227 00:30:17.325061 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" path="/var/lib/kubelet/pods/0ecbcf96-0260-4e87-afe5-9acc6098ec59/volumes"
Feb 27 00:30:17 crc kubenswrapper[4781]: I0227 00:30:17.328883 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5e01f6b-d306-41ac-9988-156063c5af7d" path="/var/lib/kubelet/pods/f5e01f6b-d306-41ac-9988-156063c5af7d/volumes"
Feb 27 00:30:18 crc kubenswrapper[4781]: I0227 00:30:18.098138 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4e258c11-5caa-4d6b-ab77-841ddf83ac81","Type":"ContainerStarted","Data":"700884efc89dd85a512e20845640f310b33900ef785ea60c53d7aa26f85af38d"}
Feb 27 00:30:18 crc kubenswrapper[4781]: I0227 00:30:18.098519 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4e258c11-5caa-4d6b-ab77-841ddf83ac81","Type":"ContainerStarted","Data":"06c91865ba81c14d409c91073f4ed9bfdd0ca7faac32ba2130c7a442b7dc699c"}
Feb 27 00:30:18 crc kubenswrapper[4781]: I0227 00:30:18.100471 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1d7f8c00-d318-4f7d-b67e-6743c3a82dae","Type":"ContainerStarted","Data":"eb3d1f00e242dd2b542c92285321576cfdd7b7c9a7730699acee14eb1633cb42"}
Feb 27 00:30:18 crc kubenswrapper[4781]: I0227 00:30:18.100530 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1d7f8c00-d318-4f7d-b67e-6743c3a82dae","Type":"ContainerStarted","Data":"4aaddd97e770ae5123af55ed21c62a17a6d5ebf9815ca468e08d021529fa6778"}
Feb 27 00:30:18 crc kubenswrapper[4781]: I0227 00:30:18.126600 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.126558997 podStartE2EDuration="2.126558997s" podCreationTimestamp="2026-02-27 00:30:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:30:18.120141394 +0000 UTC m=+1487.377680948" watchObservedRunningTime="2026-02-27 00:30:18.126558997 +0000 UTC m=+1487.384098551"
Feb 27 00:30:18 crc kubenswrapper[4781]: I0227 00:30:18.149148 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.149130117 podStartE2EDuration="2.149130117s" podCreationTimestamp="2026-02-27 00:30:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:30:18.144297037 +0000 UTC m=+1487.401836611" watchObservedRunningTime="2026-02-27 00:30:18.149130117 +0000 UTC m=+1487.406669671"
Feb 27 00:30:19 crc kubenswrapper[4781]: I0227 00:30:19.716927 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 27 00:30:19 crc kubenswrapper[4781]: I0227 00:30:19.717255 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 27 00:30:20 crc kubenswrapper[4781]: I0227 00:30:20.065037 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-49thr" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="registry-server" probeResult="failure" output=<
Feb 27 00:30:20 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s
Feb 27 00:30:20 crc kubenswrapper[4781]: >
Feb 27 00:30:21 crc kubenswrapper[4781]: I0227 00:30:21.624149 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 27 00:30:24 crc kubenswrapper[4781]: I0227 00:30:24.716796 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 27 00:30:24 crc kubenswrapper[4781]: I0227 00:30:24.717255 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 27 00:30:25 crc kubenswrapper[4781]: I0227 00:30:25.732856 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e32c3573-4acb-4d70-aa6e-2d647c108931" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.238:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:30:25 crc kubenswrapper[4781]: I0227 00:30:25.732892 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e32c3573-4acb-4d70-aa6e-2d647c108931" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.238:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:30:26 crc kubenswrapper[4781]: I0227 00:30:26.543082 4781 scope.go:117] "RemoveContainer" containerID="e7c34540c9407121a9ee96d4e0537e4a13bd65448411272b9cedd072273699e8"
Feb 27 00:30:26 crc kubenswrapper[4781]: I0227 00:30:26.558342 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 27 00:30:26 crc kubenswrapper[4781]: I0227 00:30:26.558401 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 27 00:30:26 crc kubenswrapper[4781]: I0227 00:30:26.625161 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 27 00:30:26 crc kubenswrapper[4781]: I0227 00:30:26.674282 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 27 00:30:27 crc kubenswrapper[4781]: I0227 00:30:27.284174 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 27 00:30:27 crc kubenswrapper[4781]: I0227 00:30:27.574771 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4e258c11-5caa-4d6b-ab77-841ddf83ac81" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.239:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:30:27 crc kubenswrapper[4781]: I0227 00:30:27.574783 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4e258c11-5caa-4d6b-ab77-841ddf83ac81" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.239:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:30:28 crc kubenswrapper[4781]: I0227 00:30:28.115080 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 27 00:30:30 crc kubenswrapper[4781]: I0227 00:30:30.079337 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-49thr" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="registry-server" probeResult="failure" output=<
Feb 27 00:30:30 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s
Feb 27 00:30:30 crc kubenswrapper[4781]: >
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.286192 4781 generic.go:334] "Generic (PLEG): container finished" podID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerID="12384cd6d81bca6e23791de7f51204e92fef8abc9d7b3fd6d25b23d6563fe9c1" exitCode=137
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.286739 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerDied","Data":"12384cd6d81bca6e23791de7f51204e92fef8abc9d7b3fd6d25b23d6563fe9c1"}
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.371388 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.447120 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-run-httpd\") pod \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") "
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.447187 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-sg-core-conf-yaml\") pod \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") "
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.447264 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-log-httpd\") pod \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") "
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.447287 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-config-data\") pod \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") "
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.447322 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-scripts\") pod \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") "
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.447372 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5spl\" (UniqueName: \"kubernetes.io/projected/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-kube-api-access-b5spl\") pod \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") "
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.447502 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-combined-ca-bundle\") pod \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") "
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.448445 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "93fc1751-3b02-4d9a-8278-caa0f09f8e9e" (UID: "93fc1751-3b02-4d9a-8278-caa0f09f8e9e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.449201 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "93fc1751-3b02-4d9a-8278-caa0f09f8e9e" (UID: "93fc1751-3b02-4d9a-8278-caa0f09f8e9e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.453827 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-kube-api-access-b5spl" (OuterVolumeSpecName: "kube-api-access-b5spl") pod "93fc1751-3b02-4d9a-8278-caa0f09f8e9e" (UID: "93fc1751-3b02-4d9a-8278-caa0f09f8e9e"). InnerVolumeSpecName "kube-api-access-b5spl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.455846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-scripts" (OuterVolumeSpecName: "scripts") pod "93fc1751-3b02-4d9a-8278-caa0f09f8e9e" (UID: "93fc1751-3b02-4d9a-8278-caa0f09f8e9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.489554 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "93fc1751-3b02-4d9a-8278-caa0f09f8e9e" (UID: "93fc1751-3b02-4d9a-8278-caa0f09f8e9e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.530247 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93fc1751-3b02-4d9a-8278-caa0f09f8e9e" (UID: "93fc1751-3b02-4d9a-8278-caa0f09f8e9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.550285 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.550324 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.550336 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.550347 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.550360 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.550371 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5spl\" (UniqueName: \"kubernetes.io/projected/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-kube-api-access-b5spl\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.581581 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-config-data" (OuterVolumeSpecName: "config-data") pod "93fc1751-3b02-4d9a-8278-caa0f09f8e9e" (UID: "93fc1751-3b02-4d9a-8278-caa0f09f8e9e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.652304 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.723926 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.725683 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.732487 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.297581 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerDied","Data":"2c4686839ed88b8a45c07bbc45e5d7e8f95577bd88f8f5ed02b133c4326a106e"}
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.297650 4781 scope.go:117] "RemoveContainer" containerID="12384cd6d81bca6e23791de7f51204e92fef8abc9d7b3fd6d25b23d6563fe9c1"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.298412 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.305989 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.320238 4781 scope.go:117] "RemoveContainer" containerID="fe5f026af9d91c7231233bd71a19d73aa9a872a6aadf224c6825ac237ba54fe7"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.344016 4781 scope.go:117] "RemoveContainer" containerID="7b717e2de43421aa8594d897ad2625a4ab8b03819b1fddaf68c17d412b495f36"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.377674 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.385692 4781 scope.go:117] "RemoveContainer" containerID="6aee62b4a0e99b93b70f6483a48731232b6e0bac4ac727deafa8d7b77d1fa9e5"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.388774 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.463918 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:30:35 crc kubenswrapper[4781]: E0227 00:30:35.464418 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="ceilometer-notification-agent"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.464439 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="ceilometer-notification-agent"
Feb 27 00:30:35 crc kubenswrapper[4781]: E0227 00:30:35.464476 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="sg-core"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.464483 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="sg-core"
Feb 27 00:30:35 crc kubenswrapper[4781]: E0227 00:30:35.464497 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="ceilometer-central-agent"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.464537 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="ceilometer-central-agent"
Feb 27 00:30:35 crc kubenswrapper[4781]: E0227 00:30:35.464549 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="proxy-httpd"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.464556 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="proxy-httpd"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.464750 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="ceilometer-central-agent"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.464766 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="sg-core"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.464783 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="proxy-httpd"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.464806 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="ceilometer-notification-agent"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.467962 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.470782 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.471038 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.471366 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.494854 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.574008 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-log-httpd\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.574584 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-scripts\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.574649 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njzlk\" (UniqueName: \"kubernetes.io/projected/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-kube-api-access-njzlk\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.574681 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.574745 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-run-httpd\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.574779 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.574998 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-config-data\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.575052 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.677378 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.677476 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-config-data\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.677498 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.677549 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-log-httpd\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.677577 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-scripts\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.677606 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njzlk\" (UniqueName: \"kubernetes.io/projected/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-kube-api-access-njzlk\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0"
Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.677639 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.677687 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-run-httpd\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.678123 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-run-httpd\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.679816 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-log-httpd\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.683694 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.684707 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: 
I0227 00:30:35.684825 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-config-data\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.685725 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-scripts\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.703549 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.704006 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njzlk\" (UniqueName: \"kubernetes.io/projected/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-kube-api-access-njzlk\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.786442 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:30:36 crc kubenswrapper[4781]: I0227 00:30:36.287318 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:30:36 crc kubenswrapper[4781]: I0227 00:30:36.309225 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerStarted","Data":"94c7511afd7913c3a074803cf7f0cdd498d1a4a5ebb3ef1f330c0237d0afa73c"} Feb 27 00:30:36 crc kubenswrapper[4781]: I0227 00:30:36.566718 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 27 00:30:36 crc kubenswrapper[4781]: I0227 00:30:36.568191 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 27 00:30:36 crc kubenswrapper[4781]: I0227 00:30:36.569688 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 27 00:30:36 crc kubenswrapper[4781]: I0227 00:30:36.578169 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 27 00:30:37 crc kubenswrapper[4781]: I0227 00:30:37.321162 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" path="/var/lib/kubelet/pods/93fc1751-3b02-4d9a-8278-caa0f09f8e9e/volumes" Feb 27 00:30:37 crc kubenswrapper[4781]: I0227 00:30:37.324221 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerStarted","Data":"9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51"} Feb 27 00:30:37 crc kubenswrapper[4781]: I0227 00:30:37.324892 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 27 00:30:37 crc kubenswrapper[4781]: I0227 00:30:37.331225 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Feb 27 00:30:38 crc kubenswrapper[4781]: I0227 00:30:38.336347 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerStarted","Data":"b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5"} Feb 27 00:30:39 crc kubenswrapper[4781]: I0227 00:30:39.059933 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:30:39 crc kubenswrapper[4781]: I0227 00:30:39.119967 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:30:39 crc kubenswrapper[4781]: I0227 00:30:39.301432 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-49thr"] Feb 27 00:30:39 crc kubenswrapper[4781]: I0227 00:30:39.352677 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerStarted","Data":"1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f"} Feb 27 00:30:40 crc kubenswrapper[4781]: I0227 00:30:40.363898 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-49thr" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="registry-server" containerID="cri-o://d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344" gracePeriod=2 Feb 27 00:30:40 crc kubenswrapper[4781]: I0227 00:30:40.885237 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:30:40 crc kubenswrapper[4781]: I0227 00:30:40.994925 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwm7q\" (UniqueName: \"kubernetes.io/projected/cc2a000c-f169-4622-8c82-cd4c2baa730a-kube-api-access-dwm7q\") pod \"cc2a000c-f169-4622-8c82-cd4c2baa730a\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " Feb 27 00:30:40 crc kubenswrapper[4781]: I0227 00:30:40.995003 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-utilities\") pod \"cc2a000c-f169-4622-8c82-cd4c2baa730a\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " Feb 27 00:30:40 crc kubenswrapper[4781]: I0227 00:30:40.995051 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-catalog-content\") pod \"cc2a000c-f169-4622-8c82-cd4c2baa730a\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " Feb 27 00:30:40 crc kubenswrapper[4781]: I0227 00:30:40.995727 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-utilities" (OuterVolumeSpecName: "utilities") pod "cc2a000c-f169-4622-8c82-cd4c2baa730a" (UID: "cc2a000c-f169-4622-8c82-cd4c2baa730a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.000163 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc2a000c-f169-4622-8c82-cd4c2baa730a-kube-api-access-dwm7q" (OuterVolumeSpecName: "kube-api-access-dwm7q") pod "cc2a000c-f169-4622-8c82-cd4c2baa730a" (UID: "cc2a000c-f169-4622-8c82-cd4c2baa730a"). InnerVolumeSpecName "kube-api-access-dwm7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.097330 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwm7q\" (UniqueName: \"kubernetes.io/projected/cc2a000c-f169-4622-8c82-cd4c2baa730a-kube-api-access-dwm7q\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.097358 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.112012 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc2a000c-f169-4622-8c82-cd4c2baa730a" (UID: "cc2a000c-f169-4622-8c82-cd4c2baa730a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.199716 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.375980 4781 generic.go:334] "Generic (PLEG): container finished" podID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerID="d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344" exitCode=0 Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.376015 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.376021 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49thr" event={"ID":"cc2a000c-f169-4622-8c82-cd4c2baa730a","Type":"ContainerDied","Data":"d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344"} Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.376048 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49thr" event={"ID":"cc2a000c-f169-4622-8c82-cd4c2baa730a","Type":"ContainerDied","Data":"91f3015c87e164a4f8b695fde0fa0251c13f94e03e4150c65b9ff72adaa25616"} Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.376066 4781 scope.go:117] "RemoveContainer" containerID="d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.403447 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-49thr"] Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.409196 4781 scope.go:117] "RemoveContainer" containerID="4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.416405 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-49thr"] Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.446080 4781 scope.go:117] "RemoveContainer" containerID="a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.495402 4781 scope.go:117] "RemoveContainer" containerID="d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344" Feb 27 00:30:41 crc kubenswrapper[4781]: E0227 00:30:41.495904 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344\": container with ID starting with d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344 not found: ID does not exist" containerID="d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.495937 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344"} err="failed to get container status \"d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344\": rpc error: code = NotFound desc = could not find container \"d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344\": container with ID starting with d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344 not found: ID does not exist" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.495961 4781 scope.go:117] "RemoveContainer" containerID="4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0" Feb 27 00:30:41 crc kubenswrapper[4781]: E0227 00:30:41.496367 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0\": container with ID starting with 4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0 not found: ID does not exist" containerID="4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.496404 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0"} err="failed to get container status \"4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0\": rpc error: code = NotFound desc = could not find container \"4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0\": container with ID 
starting with 4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0 not found: ID does not exist" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.496433 4781 scope.go:117] "RemoveContainer" containerID="a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a" Feb 27 00:30:41 crc kubenswrapper[4781]: E0227 00:30:41.496847 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a\": container with ID starting with a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a not found: ID does not exist" containerID="a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.496869 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a"} err="failed to get container status \"a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a\": rpc error: code = NotFound desc = could not find container \"a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a\": container with ID starting with a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a not found: ID does not exist" Feb 27 00:30:42 crc kubenswrapper[4781]: I0227 00:30:42.895111 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:30:42 crc kubenswrapper[4781]: I0227 00:30:42.895182 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:30:43 crc kubenswrapper[4781]: I0227 00:30:43.332524 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" path="/var/lib/kubelet/pods/cc2a000c-f169-4622-8c82-cd4c2baa730a/volumes" Feb 27 00:30:45 crc kubenswrapper[4781]: I0227 00:30:45.420247 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerStarted","Data":"11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4"} Feb 27 00:30:45 crc kubenswrapper[4781]: I0227 00:30:45.420815 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 00:30:45 crc kubenswrapper[4781]: I0227 00:30:45.444015 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.89654247 podStartE2EDuration="10.4439945s" podCreationTimestamp="2026-02-27 00:30:35 +0000 UTC" firstStartedPulling="2026-02-27 00:30:36.283239641 +0000 UTC m=+1505.540779195" lastFinishedPulling="2026-02-27 00:30:44.830691671 +0000 UTC m=+1514.088231225" observedRunningTime="2026-02-27 00:30:45.44396752 +0000 UTC m=+1514.701507084" watchObservedRunningTime="2026-02-27 00:30:45.4439945 +0000 UTC m=+1514.701534064" Feb 27 00:31:05 crc kubenswrapper[4781]: I0227 00:31:05.815056 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 27 00:31:12 crc kubenswrapper[4781]: I0227 00:31:12.895307 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:31:12 crc kubenswrapper[4781]: I0227 00:31:12.895983 
4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:31:12 crc kubenswrapper[4781]: I0227 00:31:12.896052 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:31:12 crc kubenswrapper[4781]: I0227 00:31:12.897194 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18f81d6f38ae3802e83160171263bed0ca095345d87ab2807429711c0c761818"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:31:12 crc kubenswrapper[4781]: I0227 00:31:12.897310 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://18f81d6f38ae3802e83160171263bed0ca095345d87ab2807429711c0c761818" gracePeriod=600 Feb 27 00:31:13 crc kubenswrapper[4781]: I0227 00:31:13.745587 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="18f81d6f38ae3802e83160171263bed0ca095345d87ab2807429711c0c761818" exitCode=0 Feb 27 00:31:13 crc kubenswrapper[4781]: I0227 00:31:13.745741 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"18f81d6f38ae3802e83160171263bed0ca095345d87ab2807429711c0c761818"} Feb 27 00:31:13 crc kubenswrapper[4781]: 
I0227 00:31:13.745956 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f"} Feb 27 00:31:13 crc kubenswrapper[4781]: I0227 00:31:13.745980 4781 scope.go:117] "RemoveContainer" containerID="40924ce0e5e04646329cd01d3e3dc65fdaf6b21bdd01704d3fa5ed81c86443f6" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.795766 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-l9w6z"] Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.806889 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-l9w6z"] Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.892379 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-6wz7g"] Feb 27 00:31:16 crc kubenswrapper[4781]: E0227 00:31:16.892955 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="extract-utilities" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.892982 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="extract-utilities" Feb 27 00:31:16 crc kubenswrapper[4781]: E0227 00:31:16.893012 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="registry-server" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.893021 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="registry-server" Feb 27 00:31:16 crc kubenswrapper[4781]: E0227 00:31:16.893038 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="extract-content" Feb 27 00:31:16 crc kubenswrapper[4781]: 
I0227 00:31:16.893047 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="extract-content" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.893321 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="registry-server" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.894271 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.896585 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.921562 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-6wz7g"] Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.959247 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-certs\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.959310 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-scripts\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.959357 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-config-data\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " 
pod="openstack/cloudkitty-db-sync-6wz7g"
Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.959386 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2tjz\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-kube-api-access-w2tjz\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g"
Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.959411 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-combined-ca-bundle\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g"
Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.061150 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-certs\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g"
Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.061213 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-scripts\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g"
Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.061266 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-config-data\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g"
Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.061302 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2tjz\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-kube-api-access-w2tjz\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g"
Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.061335 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-combined-ca-bundle\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g"
Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.068702 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-certs\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g"
Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.068962 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-config-data\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g"
Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.069280 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-scripts\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g"
Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.070045 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-combined-ca-bundle\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g"
Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.080199 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2tjz\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-kube-api-access-w2tjz\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g"
Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.239374 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-6wz7g"
Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.322985 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2274af64-0743-4ede-8fb8-e2ed801638ac" path="/var/lib/kubelet/pods/2274af64-0743-4ede-8fb8-e2ed801638ac/volumes"
Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.760471 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-6wz7g"]
Feb 27 00:31:17 crc kubenswrapper[4781]: W0227 00:31:17.775394 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb669382e_dffc_421d_80a3_82b928f54044.slice/crio-bf0ab0e9643525093c3faf04cca90f385efba4046632e8a89c2d0d13c194fa6e WatchSource:0}: Error finding container bf0ab0e9643525093c3faf04cca90f385efba4046632e8a89c2d0d13c194fa6e: Status 404 returned error can't find the container with id bf0ab0e9643525093c3faf04cca90f385efba4046632e8a89c2d0d13c194fa6e
Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.799688 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-6wz7g" event={"ID":"b669382e-dffc-421d-80a3-82b928f54044","Type":"ContainerStarted","Data":"bf0ab0e9643525093c3faf04cca90f385efba4046632e8a89c2d0d13c194fa6e"}
Feb 27 00:31:18 crc kubenswrapper[4781]: I0227 00:31:18.808869 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-6wz7g" event={"ID":"b669382e-dffc-421d-80a3-82b928f54044","Type":"ContainerStarted","Data":"08009d33d7dd60364f173703aa207fb7fe65cb10f22855e575d2a1e3d49e40a0"}
Feb 27 00:31:18 crc kubenswrapper[4781]: I0227 00:31:18.842126 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-6wz7g" podStartSLOduration=2.690077798 podStartE2EDuration="2.842091685s" podCreationTimestamp="2026-02-27 00:31:16 +0000 UTC" firstStartedPulling="2026-02-27 00:31:17.780142785 +0000 UTC m=+1547.037682339" lastFinishedPulling="2026-02-27 00:31:17.932156672 +0000 UTC m=+1547.189696226" observedRunningTime="2026-02-27 00:31:18.826577576 +0000 UTC m=+1548.084117130" watchObservedRunningTime="2026-02-27 00:31:18.842091685 +0000 UTC m=+1548.099631229"
Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.109087 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.159765 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.160034 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="ceilometer-central-agent" containerID="cri-o://9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51" gracePeriod=30
Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.160080 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="proxy-httpd" containerID="cri-o://11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4" gracePeriod=30
Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.160139 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="ceilometer-notification-agent" containerID="cri-o://b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5" gracePeriod=30
Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.160304 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="sg-core" containerID="cri-o://1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f" gracePeriod=30
Feb 27 00:31:19 crc kubenswrapper[4781]: E0227 00:31:19.769774 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod277c2e9c_3c87_442a_b5f6_52f1d63c24e9.slice/crio-9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod277c2e9c_3c87_442a_b5f6_52f1d63c24e9.slice/crio-conmon-9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51.scope\": RecentStats: unable to find data in memory cache]"
Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.832357 4781 generic.go:334] "Generic (PLEG): container finished" podID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerID="11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4" exitCode=0
Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.832394 4781 generic.go:334] "Generic (PLEG): container finished" podID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerID="1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f" exitCode=2
Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.832402 4781 generic.go:334] "Generic (PLEG): container finished" podID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerID="9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51" exitCode=0
Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.832539 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerDied","Data":"11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4"}
Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.832593 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerDied","Data":"1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f"}
Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.832605 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerDied","Data":"9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51"}
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.129517 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.737653 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.842844 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-log-httpd\") pod \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") "
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.842942 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njzlk\" (UniqueName: \"kubernetes.io/projected/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-kube-api-access-njzlk\") pod \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") "
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.842984 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-config-data\") pod \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") "
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.843018 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-run-httpd\") pod \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") "
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.843068 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-combined-ca-bundle\") pod \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") "
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.843142 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-scripts\") pod \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") "
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.843183 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-ceilometer-tls-certs\") pod \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") "
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.843293 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-sg-core-conf-yaml\") pod \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") "
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.846559 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "277c2e9c-3c87-442a-b5f6-52f1d63c24e9" (UID: "277c2e9c-3c87-442a-b5f6-52f1d63c24e9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.846887 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "277c2e9c-3c87-442a-b5f6-52f1d63c24e9" (UID: "277c2e9c-3c87-442a-b5f6-52f1d63c24e9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.873134 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-scripts" (OuterVolumeSpecName: "scripts") pod "277c2e9c-3c87-442a-b5f6-52f1d63c24e9" (UID: "277c2e9c-3c87-442a-b5f6-52f1d63c24e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.873193 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-kube-api-access-njzlk" (OuterVolumeSpecName: "kube-api-access-njzlk") pod "277c2e9c-3c87-442a-b5f6-52f1d63c24e9" (UID: "277c2e9c-3c87-442a-b5f6-52f1d63c24e9"). InnerVolumeSpecName "kube-api-access-njzlk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.885923 4781 generic.go:334] "Generic (PLEG): container finished" podID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerID="b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5" exitCode=0
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.885984 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerDied","Data":"b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5"}
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.886010 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerDied","Data":"94c7511afd7913c3a074803cf7f0cdd498d1a4a5ebb3ef1f330c0237d0afa73c"}
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.886026 4781 scope.go:117] "RemoveContainer" containerID="11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4"
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.886197 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.887852 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "277c2e9c-3c87-442a-b5f6-52f1d63c24e9" (UID: "277c2e9c-3c87-442a-b5f6-52f1d63c24e9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.945054 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "277c2e9c-3c87-442a-b5f6-52f1d63c24e9" (UID: "277c2e9c-3c87-442a-b5f6-52f1d63c24e9"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.946275 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.946321 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njzlk\" (UniqueName: \"kubernetes.io/projected/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-kube-api-access-njzlk\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.946333 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.946344 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.946356 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.946367 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.017302 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "277c2e9c-3c87-442a-b5f6-52f1d63c24e9" (UID: "277c2e9c-3c87-442a-b5f6-52f1d63c24e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.020519 4781 scope.go:117] "RemoveContainer" containerID="1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.054225 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.055455 4781 scope.go:117] "RemoveContainer" containerID="b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.079755 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-config-data" (OuterVolumeSpecName: "config-data") pod "277c2e9c-3c87-442a-b5f6-52f1d63c24e9" (UID: "277c2e9c-3c87-442a-b5f6-52f1d63c24e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.091684 4781 scope.go:117] "RemoveContainer" containerID="9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.132821 4781 scope.go:117] "RemoveContainer" containerID="11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4"
Feb 27 00:31:21 crc kubenswrapper[4781]: E0227 00:31:21.137059 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4\": container with ID starting with 11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4 not found: ID does not exist" containerID="11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.137109 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4"} err="failed to get container status \"11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4\": rpc error: code = NotFound desc = could not find container \"11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4\": container with ID starting with 11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4 not found: ID does not exist"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.137134 4781 scope.go:117] "RemoveContainer" containerID="1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f"
Feb 27 00:31:21 crc kubenswrapper[4781]: E0227 00:31:21.137802 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f\": container with ID starting with 1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f not found: ID does not exist" containerID="1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.137911 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f"} err="failed to get container status \"1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f\": rpc error: code = NotFound desc = could not find container \"1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f\": container with ID starting with 1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f not found: ID does not exist"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.138003 4781 scope.go:117] "RemoveContainer" containerID="b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5"
Feb 27 00:31:21 crc kubenswrapper[4781]: E0227 00:31:21.142146 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5\": container with ID starting with b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5 not found: ID does not exist" containerID="b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.142247 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5"} err="failed to get container status \"b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5\": rpc error: code = NotFound desc = could not find container \"b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5\": container with ID starting with b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5 not found: ID does not exist"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.142311 4781 scope.go:117] "RemoveContainer" containerID="9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51"
Feb 27 00:31:21 crc kubenswrapper[4781]: E0227 00:31:21.142579 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51\": container with ID starting with 9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51 not found: ID does not exist" containerID="9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.142674 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51"} err="failed to get container status \"9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51\": rpc error: code = NotFound desc = could not find container \"9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51\": container with ID starting with 9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51 not found: ID does not exist"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.159289 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.227783 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.244304 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.288559 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:31:21 crc kubenswrapper[4781]: E0227 00:31:21.288991 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="proxy-httpd"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.289008 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="proxy-httpd"
Feb 27 00:31:21 crc kubenswrapper[4781]: E0227 00:31:21.289026 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="sg-core"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.289032 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="sg-core"
Feb 27 00:31:21 crc kubenswrapper[4781]: E0227 00:31:21.289042 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="ceilometer-notification-agent"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.289051 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="ceilometer-notification-agent"
Feb 27 00:31:21 crc kubenswrapper[4781]: E0227 00:31:21.289069 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="ceilometer-central-agent"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.289074 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="ceilometer-central-agent"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.289260 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="ceilometer-central-agent"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.289276 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="ceilometer-notification-agent"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.289286 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="sg-core"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.289297 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="proxy-httpd"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.291074 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.297325 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.297724 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.298191 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.322608 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" path="/var/lib/kubelet/pods/277c2e9c-3c87-442a-b5f6-52f1d63c24e9/volumes"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.323520 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.362521 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.362589 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.362685 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-scripts\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.362756 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-config-data\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.362777 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-log-httpd\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.362812 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g8mz\" (UniqueName: \"kubernetes.io/projected/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-kube-api-access-8g8mz\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.362854 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-run-httpd\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.362900 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.464420 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g8mz\" (UniqueName: \"kubernetes.io/projected/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-kube-api-access-8g8mz\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.464797 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-run-httpd\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.464834 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.464918 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.464941 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.464987 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-scripts\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.465016 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-config-data\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.465038 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-log-httpd\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.465310 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-run-httpd\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.465472 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-log-httpd\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.471345 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.474195 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.475183 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-scripts\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.476193 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.482058 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-config-data\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.491437 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g8mz\" (UniqueName: \"kubernetes.io/projected/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-kube-api-access-8g8mz\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.609124 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.907145 4781 generic.go:334] "Generic (PLEG): container finished" podID="b669382e-dffc-421d-80a3-82b928f54044" containerID="08009d33d7dd60364f173703aa207fb7fe65cb10f22855e575d2a1e3d49e40a0" exitCode=0
Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.907219 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-6wz7g" event={"ID":"b669382e-dffc-421d-80a3-82b928f54044","Type":"ContainerDied","Data":"08009d33d7dd60364f173703aa207fb7fe65cb10f22855e575d2a1e3d49e40a0"}
Feb 27 00:31:22 crc kubenswrapper[4781]: I0227 00:31:22.179740 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:31:22 crc kubenswrapper[4781]: I0227 00:31:22.953505 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f5736d7-ab3f-41d9-b5ec-94da30e708f1","Type":"ContainerStarted","Data":"01d7a9064d9fb0090af236b1310f46844f7df89bee45fc33b93891145fc80815"}
Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.473342 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.616117 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-scripts\") pod \"b669382e-dffc-421d-80a3-82b928f54044\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.616215 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-combined-ca-bundle\") pod \"b669382e-dffc-421d-80a3-82b928f54044\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.616282 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-config-data\") pod \"b669382e-dffc-421d-80a3-82b928f54044\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.616389 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-certs\") pod \"b669382e-dffc-421d-80a3-82b928f54044\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.616418 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2tjz\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-kube-api-access-w2tjz\") pod \"b669382e-dffc-421d-80a3-82b928f54044\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.623498 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-scripts" (OuterVolumeSpecName: "scripts") pod "b669382e-dffc-421d-80a3-82b928f54044" (UID: "b669382e-dffc-421d-80a3-82b928f54044"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.623821 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-certs" (OuterVolumeSpecName: "certs") pod "b669382e-dffc-421d-80a3-82b928f54044" (UID: "b669382e-dffc-421d-80a3-82b928f54044"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.639501 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-kube-api-access-w2tjz" (OuterVolumeSpecName: "kube-api-access-w2tjz") pod "b669382e-dffc-421d-80a3-82b928f54044" (UID: "b669382e-dffc-421d-80a3-82b928f54044"). InnerVolumeSpecName "kube-api-access-w2tjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.669139 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b669382e-dffc-421d-80a3-82b928f54044" (UID: "b669382e-dffc-421d-80a3-82b928f54044"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.690105 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-config-data" (OuterVolumeSpecName: "config-data") pod "b669382e-dffc-421d-80a3-82b928f54044" (UID: "b669382e-dffc-421d-80a3-82b928f54044"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.721812 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.721854 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2tjz\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-kube-api-access-w2tjz\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.721868 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.721879 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.721890 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.966611 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-6wz7g" event={"ID":"b669382e-dffc-421d-80a3-82b928f54044","Type":"ContainerDied","Data":"bf0ab0e9643525093c3faf04cca90f385efba4046632e8a89c2d0d13c194fa6e"} Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.966662 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf0ab0e9643525093c3faf04cca90f385efba4046632e8a89c2d0d13c194fa6e" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.966692 4781 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.211213 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="919ba171-1971-416c-99c1-5dfcacc10a28" containerName="rabbitmq" containerID="cri-o://dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f" gracePeriod=604794 Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.236775 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-c5vn9"] Feb 27 00:31:25 crc kubenswrapper[4781]: E0227 00:31:25.239061 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b669382e-dffc-421d-80a3-82b928f54044" containerName="cloudkitty-db-sync" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.239091 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b669382e-dffc-421d-80a3-82b928f54044" containerName="cloudkitty-db-sync" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.239306 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b669382e-dffc-421d-80a3-82b928f54044" containerName="cloudkitty-db-sync" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.240009 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.242292 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.269292 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-c5vn9"] Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.273521 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-scripts\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.273615 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-certs\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.273654 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-combined-ca-bundle\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.273703 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpvfl\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-kube-api-access-cpvfl\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " 
pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.273770 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-config-data\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.281657 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-g672n"] Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.290129 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-g672n"] Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.321857 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87b3198c-30ab-415a-b24b-b26ab3da838e" path="/var/lib/kubelet/pods/87b3198c-30ab-415a-b24b-b26ab3da838e/volumes" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.375695 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-config-data\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.375836 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-scripts\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.375909 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-certs\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.375932 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-combined-ca-bundle\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.375989 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpvfl\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-kube-api-access-cpvfl\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.385527 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-combined-ca-bundle\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.387838 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-certs\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.389706 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-config-data\") pod 
\"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.400171 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpvfl\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-kube-api-access-cpvfl\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.411158 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-scripts\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.570574 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.905941 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerName="rabbitmq" containerID="cri-o://84e4c6c19d757fd81ef5f856104b51d9057ffe90f91b0313f39e58f7d670a984" gracePeriod=604795 Feb 27 00:31:26 crc kubenswrapper[4781]: I0227 00:31:26.351925 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 00:31:26 crc kubenswrapper[4781]: I0227 00:31:26.600484 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-c5vn9"] Feb 27 00:31:27 crc kubenswrapper[4781]: I0227 00:31:27.258369 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-c5vn9" 
event={"ID":"fee23b33-5d55-45c9-b024-0b4865019095","Type":"ContainerStarted","Data":"4c15c466d7915dc653aadd3dff0e84b4a8fd3f49a7805b84c66c98b2891abd65"} Feb 27 00:31:27 crc kubenswrapper[4781]: I0227 00:31:27.258655 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-c5vn9" event={"ID":"fee23b33-5d55-45c9-b024-0b4865019095","Type":"ContainerStarted","Data":"c1cf39a6e1aaa1fdbc695699fd6efe141913102901ca2317e8b825da4d37a1de"} Feb 27 00:31:27 crc kubenswrapper[4781]: I0227 00:31:27.260116 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f5736d7-ab3f-41d9-b5ec-94da30e708f1","Type":"ContainerStarted","Data":"5fba32f66c88e72d5dff32f9fc4c8c9e3acbeab261897ce7904168caa209899e"} Feb 27 00:31:27 crc kubenswrapper[4781]: I0227 00:31:27.260138 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f5736d7-ab3f-41d9-b5ec-94da30e708f1","Type":"ContainerStarted","Data":"54d4957f3fe1d6d4bccbcd59c6c99d49bb1a6b6984834e6da5136a16d63d4bcd"} Feb 27 00:31:27 crc kubenswrapper[4781]: I0227 00:31:27.284559 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-c5vn9" podStartSLOduration=3.284540779 podStartE2EDuration="3.284540779s" podCreationTimestamp="2026-02-27 00:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:31:27.279075001 +0000 UTC m=+1556.536614555" watchObservedRunningTime="2026-02-27 00:31:27.284540779 +0000 UTC m=+1556.542080333" Feb 27 00:31:28 crc kubenswrapper[4781]: I0227 00:31:28.272923 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f5736d7-ab3f-41d9-b5ec-94da30e708f1","Type":"ContainerStarted","Data":"e0a664c2f58126375bd72172545af9a09ea54c63d627e55c8853f530614522f6"} Feb 27 00:31:29 crc kubenswrapper[4781]: I0227 00:31:29.285837 
4781 generic.go:334] "Generic (PLEG): container finished" podID="fee23b33-5d55-45c9-b024-0b4865019095" containerID="4c15c466d7915dc653aadd3dff0e84b4a8fd3f49a7805b84c66c98b2891abd65" exitCode=0 Feb 27 00:31:29 crc kubenswrapper[4781]: I0227 00:31:29.285918 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-c5vn9" event={"ID":"fee23b33-5d55-45c9-b024-0b4865019095","Type":"ContainerDied","Data":"4c15c466d7915dc653aadd3dff0e84b4a8fd3f49a7805b84c66c98b2891abd65"} Feb 27 00:31:29 crc kubenswrapper[4781]: I0227 00:31:29.362758 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="919ba171-1971-416c-99c1-5dfcacc10a28" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Feb 27 00:31:29 crc kubenswrapper[4781]: I0227 00:31:29.758786 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.110:5671: connect: connection refused" Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.321474 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f5736d7-ab3f-41d9-b5ec-94da30e708f1","Type":"ContainerStarted","Data":"9a972aade56daf285cc3b78040b476a15071b5264e6b20ac696006680085dedd"} Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.363556 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.788936294 podStartE2EDuration="9.363533161s" podCreationTimestamp="2026-02-27 00:31:21 +0000 UTC" firstStartedPulling="2026-02-27 00:31:22.182731597 +0000 UTC m=+1551.440271151" lastFinishedPulling="2026-02-27 00:31:29.757328464 +0000 UTC m=+1559.014868018" observedRunningTime="2026-02-27 00:31:30.347888948 +0000 UTC m=+1559.605428532" watchObservedRunningTime="2026-02-27 
00:31:30.363533161 +0000 UTC m=+1559.621072725" Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.860044 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.989904 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-combined-ca-bundle\") pod \"fee23b33-5d55-45c9-b024-0b4865019095\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.989966 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-config-data\") pod \"fee23b33-5d55-45c9-b024-0b4865019095\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.989997 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-certs\") pod \"fee23b33-5d55-45c9-b024-0b4865019095\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.990061 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpvfl\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-kube-api-access-cpvfl\") pod \"fee23b33-5d55-45c9-b024-0b4865019095\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.990211 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-scripts\") pod \"fee23b33-5d55-45c9-b024-0b4865019095\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " Feb 27 00:31:30 
crc kubenswrapper[4781]: I0227 00:31:30.995995 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-kube-api-access-cpvfl" (OuterVolumeSpecName: "kube-api-access-cpvfl") pod "fee23b33-5d55-45c9-b024-0b4865019095" (UID: "fee23b33-5d55-45c9-b024-0b4865019095"). InnerVolumeSpecName "kube-api-access-cpvfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.996431 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-certs" (OuterVolumeSpecName: "certs") pod "fee23b33-5d55-45c9-b024-0b4865019095" (UID: "fee23b33-5d55-45c9-b024-0b4865019095"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.010846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-scripts" (OuterVolumeSpecName: "scripts") pod "fee23b33-5d55-45c9-b024-0b4865019095" (UID: "fee23b33-5d55-45c9-b024-0b4865019095"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.020252 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-config-data" (OuterVolumeSpecName: "config-data") pod "fee23b33-5d55-45c9-b024-0b4865019095" (UID: "fee23b33-5d55-45c9-b024-0b4865019095"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.040959 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fee23b33-5d55-45c9-b024-0b4865019095" (UID: "fee23b33-5d55-45c9-b024-0b4865019095"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.093808 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.093846 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.093858 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.093887 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.093896 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpvfl\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-kube-api-access-cpvfl\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.337541 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.337999 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.338030 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-c5vn9" event={"ID":"fee23b33-5d55-45c9-b024-0b4865019095","Type":"ContainerDied","Data":"c1cf39a6e1aaa1fdbc695699fd6efe141913102901ca2317e8b825da4d37a1de"} Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.338053 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1cf39a6e1aaa1fdbc695699fd6efe141913102901ca2317e8b825da4d37a1de" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.526436 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.526684 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="db34a476-dd22-4085-bb2c-a8e57b0d9889" containerName="cloudkitty-proc" containerID="cri-o://b68ba0c8681372d6b46f2eda69405ebdb37954378e7c21495676222df54a1d3a" gracePeriod=30 Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.559879 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.560117 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerName="cloudkitty-api-log" containerID="cri-o://d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241" gracePeriod=30 Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.560239 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" 
containerName="cloudkitty-api" containerID="cri-o://e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a" gracePeriod=30
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.205730 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.224877 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-tls\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.224957 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-plugins-conf\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.224996 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-erlang-cookie\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.225029 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf9tq\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-kube-api-access-tf9tq\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.225112 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-confd\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.225148 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-plugins\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.225179 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/919ba171-1971-416c-99c1-5dfcacc10a28-pod-info\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.225206 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/919ba171-1971-416c-99c1-5dfcacc10a28-erlang-cookie-secret\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.225818 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.225910 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-server-conf\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.225935 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-config-data\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.226486 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.226997 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.231309 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.235878 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.249846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/919ba171-1971-416c-99c1-5dfcacc10a28-pod-info" (OuterVolumeSpecName: "pod-info") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.243978 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-kube-api-access-tf9tq" (OuterVolumeSpecName: "kube-api-access-tf9tq") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "kube-api-access-tf9tq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.289743 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919ba171-1971-416c-99c1-5dfcacc10a28-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.292907 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc" (OuterVolumeSpecName: "persistence") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "pvc-32c98d96-9f26-419a-9095-7dcb737794cc". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.328293 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.328319 4781 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.328329 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.328339 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf9tq\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-kube-api-access-tf9tq\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.328348 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.328355 4781 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/919ba171-1971-416c-99c1-5dfcacc10a28-pod-info\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.328365 4781 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/919ba171-1971-416c-99c1-5dfcacc10a28-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.328386 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") on node \"crc\" "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.371486 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-server-conf" (OuterVolumeSpecName: "server-conf") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.432337 4781 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-server-conf\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.433201 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-config-data" (OuterVolumeSpecName: "config-data") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.433439 4781 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.434123 4781 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-32c98d96-9f26-419a-9095-7dcb737794cc" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc") on node "crc"
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.466970 4781 generic.go:334] "Generic (PLEG): container finished" podID="db34a476-dd22-4085-bb2c-a8e57b0d9889" containerID="b68ba0c8681372d6b46f2eda69405ebdb37954378e7c21495676222df54a1d3a" exitCode=0
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.467421 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"db34a476-dd22-4085-bb2c-a8e57b0d9889","Type":"ContainerDied","Data":"b68ba0c8681372d6b46f2eda69405ebdb37954378e7c21495676222df54a1d3a"}
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.502988 4781 generic.go:334] "Generic (PLEG): container finished" podID="919ba171-1971-416c-99c1-5dfcacc10a28" containerID="dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f" exitCode=0
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.503045 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"919ba171-1971-416c-99c1-5dfcacc10a28","Type":"ContainerDied","Data":"dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f"}
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.503070 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"919ba171-1971-416c-99c1-5dfcacc10a28","Type":"ContainerDied","Data":"f58e1ef93098c46c57b5e59fd849c5fcd9c3a1bc9f7c9d503b32be5e67364d02"}
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.503085 4781 scope.go:117] "RemoveContainer" containerID="dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f"
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.503189 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.531602 4781 generic.go:334] "Generic (PLEG): container finished" podID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerID="d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241" exitCode=143
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.531707 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"75721c64-91e7-468b-8157-9f7b0f8060b0","Type":"ContainerDied","Data":"d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241"}
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.535495 4781 reconciler_common.go:293] "Volume detached for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.535524 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.545875 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.545961 4781 generic.go:334] "Generic (PLEG): container finished" podID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerID="84e4c6c19d757fd81ef5f856104b51d9057ffe90f91b0313f39e58f7d670a984" exitCode=0
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.546742 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c7ca2a9f-a42e-4d9b-89a7-f2590842f328","Type":"ContainerDied","Data":"84e4c6c19d757fd81ef5f856104b51d9057ffe90f91b0313f39e58f7d670a984"}
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.597159 4781 scope.go:117] "RemoveContainer" containerID="96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7"
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.619702 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.637038 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.668746 4781 scope.go:117] "RemoveContainer" containerID="dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f"
Feb 27 00:31:32 crc kubenswrapper[4781]: E0227 00:31:32.669608 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f\": container with ID starting with dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f not found: ID does not exist" containerID="dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f"
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.669652 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f"} err="failed to get container status \"dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f\": rpc error: code = NotFound desc = could not find container \"dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f\": container with ID starting with dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f not found: ID does not exist"
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.669677 4781 scope.go:117] "RemoveContainer" containerID="96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7"
Feb 27 00:31:32 crc kubenswrapper[4781]: E0227 00:31:32.669861 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7\": container with ID starting with 96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7 not found: ID does not exist" containerID="96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7"
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.669876 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7"} err="failed to get container status \"96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7\": rpc error: code = NotFound desc = could not find container \"96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7\": container with ID starting with 96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7 not found: ID does not exist"
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.750818 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-pod-info\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.750965 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-plugins\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.751013 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-config-data\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.751043 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnr8b\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-kube-api-access-dnr8b\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.751111 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-erlang-cookie\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.752133 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.752259 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.752486 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-server-conf\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.752532 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-plugins-conf\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.752565 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-confd\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.753023 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-erlang-cookie-secret\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.753115 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-tls\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") "
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.753993 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.766043 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-pod-info" (OuterVolumeSpecName: "pod-info") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.768051 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.768502 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.780820 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-kube-api-access-dnr8b" (OuterVolumeSpecName: "kube-api-access-dnr8b") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "kube-api-access-dnr8b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.787549 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.839895 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.868162 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.868222 4781 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-pod-info\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.868235 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnr8b\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-kube-api-access-dnr8b\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.868250 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.868261 4781 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.868272 4781 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.935680 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-config-data" (OuterVolumeSpecName: "config-data") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.975038 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.016577 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.049493 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.109260 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: E0227 00:31:33.109785 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919ba171-1971-416c-99c1-5dfcacc10a28" containerName="setup-container"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.109808 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="919ba171-1971-416c-99c1-5dfcacc10a28" containerName="setup-container"
Feb 27 00:31:33 crc kubenswrapper[4781]: E0227 00:31:33.109816 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919ba171-1971-416c-99c1-5dfcacc10a28" containerName="rabbitmq"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.109823 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="919ba171-1971-416c-99c1-5dfcacc10a28" containerName="rabbitmq"
Feb 27 00:31:33 crc kubenswrapper[4781]: E0227 00:31:33.109835 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerName="rabbitmq"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.109842 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerName="rabbitmq"
Feb 27 00:31:33 crc kubenswrapper[4781]: E0227 00:31:33.109857 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerName="setup-container"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.109863 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerName="setup-container"
Feb 27 00:31:33 crc kubenswrapper[4781]: E0227 00:31:33.109873 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee23b33-5d55-45c9-b024-0b4865019095" containerName="cloudkitty-storageinit"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.109879 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee23b33-5d55-45c9-b024-0b4865019095" containerName="cloudkitty-storageinit"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.110063 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerName="rabbitmq"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.110076 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee23b33-5d55-45c9-b024-0b4865019095" containerName="cloudkitty-storageinit"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.110089 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="919ba171-1971-416c-99c1-5dfcacc10a28" containerName="rabbitmq"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.112937 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.122027 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jcfdg"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.122222 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.122368 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.122469 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.122562 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.122706 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.122804 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.132930 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.150725 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-server-conf" (OuterVolumeSpecName: "server-conf") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.152241 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb" (OuterVolumeSpecName: "persistence") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "pvc-78362793-d2f7-4c5f-943c-efd8f93773cb". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.159780 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.186248 4781 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-server-conf\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.186281 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.186316 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") on node \"crc\" "
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.238286 4781 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.238440 4781 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-78362793-d2f7-4c5f-943c-efd8f93773cb" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb") on node "crc"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.252197 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291168 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhzw9\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-kube-api-access-qhzw9\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291239 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291296 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291327 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-config-data\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291343 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ed38e2f2-b350-4abd-abe2-859c9d504aa8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291366 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291419 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291612 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291749 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.292192 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.292236 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ed38e2f2-b350-4abd-abe2-859c9d504aa8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.292411 4781 reconciler_common.go:293] "Volume detached for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.323110 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="919ba171-1971-416c-99c1-5dfcacc10a28" path="/var/lib/kubelet/pods/919ba171-1971-416c-99c1-5dfcacc10a28/volumes"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.393524 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-combined-ca-bundle\") pod \"db34a476-dd22-4085-bb2c-a8e57b0d9889\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") "
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.393933 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwnr7\" (UniqueName:
\"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-kube-api-access-qwnr7\") pod \"db34a476-dd22-4085-bb2c-a8e57b0d9889\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.393971 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-certs\") pod \"db34a476-dd22-4085-bb2c-a8e57b0d9889\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.394002 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data-custom\") pod \"db34a476-dd22-4085-bb2c-a8e57b0d9889\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.394066 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data\") pod \"db34a476-dd22-4085-bb2c-a8e57b0d9889\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.394088 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-scripts\") pod \"db34a476-dd22-4085-bb2c-a8e57b0d9889\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.394757 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ed38e2f2-b350-4abd-abe2-859c9d504aa8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.394859 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhzw9\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-kube-api-access-qhzw9\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395318 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395363 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395394 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-config-data\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395412 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ed38e2f2-b350-4abd-abe2-859c9d504aa8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395432 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395487 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395510 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395556 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395648 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.396353 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.396505 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.397037 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-config-data\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.398537 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.399934 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.400463 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a1416593fd912ec74c6e12871251980e537685bd157bf8eba211fce64d9b048a/globalmount\"" pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.400167 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.400918 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ed38e2f2-b350-4abd-abe2-859c9d504aa8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.400979 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ed38e2f2-b350-4abd-abe2-859c9d504aa8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.401264 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-scripts" (OuterVolumeSpecName: "scripts") pod "db34a476-dd22-4085-bb2c-a8e57b0d9889" (UID: "db34a476-dd22-4085-bb2c-a8e57b0d9889"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.401373 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "db34a476-dd22-4085-bb2c-a8e57b0d9889" (UID: "db34a476-dd22-4085-bb2c-a8e57b0d9889"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.401657 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-kube-api-access-qwnr7" (OuterVolumeSpecName: "kube-api-access-qwnr7") pod "db34a476-dd22-4085-bb2c-a8e57b0d9889" (UID: "db34a476-dd22-4085-bb2c-a8e57b0d9889"). InnerVolumeSpecName "kube-api-access-qwnr7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.401896 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-certs" (OuterVolumeSpecName: "certs") pod "db34a476-dd22-4085-bb2c-a8e57b0d9889" (UID: "db34a476-dd22-4085-bb2c-a8e57b0d9889"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.402372 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.416192 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.416678 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhzw9\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-kube-api-access-qhzw9\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.445564 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.445821 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data" (OuterVolumeSpecName: "config-data") pod "db34a476-dd22-4085-bb2c-a8e57b0d9889" (UID: "db34a476-dd22-4085-bb2c-a8e57b0d9889"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.454019 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db34a476-dd22-4085-bb2c-a8e57b0d9889" (UID: "db34a476-dd22-4085-bb2c-a8e57b0d9889"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.498242 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.498281 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwnr7\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-kube-api-access-qwnr7\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.498294 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-certs\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.498305 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.498317 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.498329 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.548753 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.558138 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"db34a476-dd22-4085-bb2c-a8e57b0d9889","Type":"ContainerDied","Data":"3977a787677f2d7efa8ed48ba20f2b14940f8edb53792eab26f57aa6da59e0d1"}
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.558189 4781 scope.go:117] "RemoveContainer" containerID="b68ba0c8681372d6b46f2eda69405ebdb37954378e7c21495676222df54a1d3a"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.558185 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.562212 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c7ca2a9f-a42e-4d9b-89a7-f2590842f328","Type":"ContainerDied","Data":"231128a0e69346808037ae68c6b271f51f365db9ad7d7761da0f5c52d5d3f07d"}
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.562306 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.692604 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.703575 4781 scope.go:117] "RemoveContainer" containerID="84e4c6c19d757fd81ef5f856104b51d9057ffe90f91b0313f39e58f7d670a984"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.758035 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.765733 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.788018 4781 scope.go:117] "RemoveContainer" containerID="592b25e10dba92f06ec6db612c25fdc12d9afc456496a972e547225b9ac93f91"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.810662 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.822310 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: E0227 00:31:33.822802 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db34a476-dd22-4085-bb2c-a8e57b0d9889" containerName="cloudkitty-proc"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.822821 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="db34a476-dd22-4085-bb2c-a8e57b0d9889" containerName="cloudkitty-proc"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.823049 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="db34a476-dd22-4085-bb2c-a8e57b0d9889" containerName="cloudkitty-proc"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.825273 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.833348 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.858349 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.860690 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.862859 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.863110 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.863226 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.863392 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.864210 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-t6n8b"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.865681 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.866055 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.868689 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.892034 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.973293 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.014747 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.014808 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-scripts\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.014843 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.014862 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/cf4c3569-6860-4c2a-8923-42e436279a11-certs\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015027 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-config-data\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015095 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015139 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37519387-1738-4500-9953-52deba3e4a85-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015231 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015354 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015484 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015527 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8nqs\" (UniqueName: \"kubernetes.io/projected/cf4c3569-6860-4c2a-8923-42e436279a11-kube-api-access-x8nqs\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015545 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37519387-1738-4500-9953-52deba3e4a85-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015563 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015621 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015672 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015694 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015711 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxmc9\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-kube-api-access-dxmc9\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117200 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptk2k\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-kube-api-access-ptk2k\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117246 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75721c64-91e7-468b-8157-9f7b0f8060b0-logs\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117279 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117346 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data-custom\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117402 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-certs\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117433 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-internal-tls-certs\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117463 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-scripts\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117545 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-public-tls-certs\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117641 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-combined-ca-bundle\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117889 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-scripts\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117928 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117948 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/cf4c3569-6860-4c2a-8923-42e436279a11-certs\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118386 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-config-data\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118414 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118444 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37519387-1738-4500-9953-52deba3e4a85-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118475 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118512 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118561 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118585 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8nqs\" (UniqueName: \"kubernetes.io/projected/cf4c3569-6860-4c2a-8923-42e436279a11-kube-api-access-x8nqs\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc
kubenswrapper[4781]: I0227 00:31:34.118609 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37519387-1738-4500-9953-52deba3e4a85-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118640 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118660 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118688 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118707 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118722 4781 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-dxmc9\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-kube-api-access-dxmc9\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118745 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.123997 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75721c64-91e7-468b-8157-9f7b0f8060b0-logs" (OuterVolumeSpecName: "logs") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.125944 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.126471 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.126753 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.127055 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.127269 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.128147 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.140412 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-scripts" (OuterVolumeSpecName: "scripts") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.140894 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37519387-1738-4500-9953-52deba3e4a85-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.142151 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/cf4c3569-6860-4c2a-8923-42e436279a11-certs\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.142793 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-scripts\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.143516 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.143542 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a2eaf337fb87b6a71958dbd52c87dbf5c448ea95938dfd82cb1cc22a9e40efc9/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.145657 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.146433 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.147257 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-certs" (OuterVolumeSpecName: "certs") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.148225 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37519387-1738-4500-9953-52deba3e4a85-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.148551 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-config-data\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.148797 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-kube-api-access-ptk2k" (OuterVolumeSpecName: "kube-api-access-ptk2k") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "kube-api-access-ptk2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.149181 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.149255 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.152935 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxmc9\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-kube-api-access-dxmc9\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.163220 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8nqs\" (UniqueName: \"kubernetes.io/projected/cf4c3569-6860-4c2a-8923-42e436279a11-kube-api-access-x8nqs\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.215954 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.220520 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.220549 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.220558 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.220566 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.220574 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptk2k\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-kube-api-access-ptk2k\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.220585 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75721c64-91e7-468b-8157-9f7b0f8060b0-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.231811 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.250449 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-public-tls-certs" 
(OuterVolumeSpecName: "public-tls-certs") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.273994 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.283870 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data" (OuterVolumeSpecName: "config-data") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.300847 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.323356 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.323485 4781 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.324194 4781 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.453723 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.490955 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.615538 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ed38e2f2-b350-4abd-abe2-859c9d504aa8","Type":"ContainerStarted","Data":"2352a458a3fa8043406f44144b7eb6d0f2fae518c516b02b5cff94fb50ca50fa"} Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.658307 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.658597 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"75721c64-91e7-468b-8157-9f7b0f8060b0","Type":"ContainerDied","Data":"e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a"} Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.658686 4781 scope.go:117] "RemoveContainer" containerID="e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.661757 4781 generic.go:334] "Generic (PLEG): container finished" podID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerID="e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a" exitCode=0 Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.661818 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"75721c64-91e7-468b-8157-9f7b0f8060b0","Type":"ContainerDied","Data":"27f0f2f53c09daadc606bc872e1f5df520a0c8f2a01549f894ec755d7a09a157"} Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.692271 4781 scope.go:117] "RemoveContainer" containerID="d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.728944 4781 scope.go:117] "RemoveContainer" containerID="e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a" Feb 27 00:31:34 crc kubenswrapper[4781]: E0227 00:31:34.731736 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a\": container with ID starting with e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a not found: ID does not exist" containerID="e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.731769 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a"} err="failed to get container status \"e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a\": rpc error: code = NotFound desc = could not find container \"e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a\": container with ID starting with e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a not found: ID does not exist" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.731791 4781 scope.go:117] "RemoveContainer" containerID="d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241" Feb 27 00:31:34 crc kubenswrapper[4781]: E0227 00:31:34.735923 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241\": container with ID starting with d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241 not found: ID does not exist" containerID="d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.735955 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241"} err="failed to get container status \"d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241\": rpc error: code = NotFound desc = could not find container \"d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241\": container with ID starting with d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241 not found: ID does not exist" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.735998 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.755912 4781 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.769180 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:31:34 crc kubenswrapper[4781]: E0227 00:31:34.769763 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerName="cloudkitty-api" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.769782 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerName="cloudkitty-api" Feb 27 00:31:34 crc kubenswrapper[4781]: E0227 00:31:34.769827 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerName="cloudkitty-api-log" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.769833 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerName="cloudkitty-api-log" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.770024 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerName="cloudkitty-api" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.770039 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerName="cloudkitty-api-log" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.771214 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.774057 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.774241 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.774340 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.789792 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.830096 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-982rb"] Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.832887 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.840899 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.851851 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.852165 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7ad9523-5281-4d1c-a9d5-92982905d525-logs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 
00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.852321 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9hkk\" (UniqueName: \"kubernetes.io/projected/a7ad9523-5281-4d1c-a9d5-92982905d525-kube-api-access-q9hkk\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.852476 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-scripts\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.852637 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.852735 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.852811 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a7ad9523-5281-4d1c-a9d5-92982905d525-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.852876 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.852944 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-config-data\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.897708 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-982rb"] Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958079 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9hkk\" (UniqueName: \"kubernetes.io/projected/a7ad9523-5281-4d1c-a9d5-92982905d525-kube-api-access-q9hkk\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958127 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958174 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " 
pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958210 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-scripts\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958245 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958271 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sdc6\" (UniqueName: \"kubernetes.io/projected/f81105ac-48e2-4b90-820a-8d7758ad3b33-kube-api-access-9sdc6\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958293 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958311 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-config\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 
00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958333 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958351 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a7ad9523-5281-4d1c-a9d5-92982905d525-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958369 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958390 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958408 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-config-data\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958428 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958450 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958484 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7ad9523-5281-4d1c-a9d5-92982905d525-logs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958932 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7ad9523-5281-4d1c-a9d5-92982905d525-logs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.970254 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-scripts\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.971052 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " 
pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.981208 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.991871 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-config-data\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.992252 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.992772 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a7ad9523-5281-4d1c-a9d5-92982905d525-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.995918 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.004271 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9hkk\" (UniqueName: 
\"kubernetes.io/projected/a7ad9523-5281-4d1c-a9d5-92982905d525-kube-api-access-q9hkk\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.062991 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.063092 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sdc6\" (UniqueName: \"kubernetes.io/projected/f81105ac-48e2-4b90-820a-8d7758ad3b33-kube-api-access-9sdc6\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.063122 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.063142 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-config\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.063181 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-svc\") pod 
\"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.063205 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.063270 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.064129 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.065490 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.065984 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " 
pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.066041 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.068226 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-config\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.068435 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.071969 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.092569 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sdc6\" (UniqueName: \"kubernetes.io/projected/f81105ac-48e2-4b90-820a-8d7758ad3b33-kube-api-access-9sdc6\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.111139 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.203054 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.228731 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.323809 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" path="/var/lib/kubelet/pods/75721c64-91e7-468b-8157-9f7b0f8060b0/volumes" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.324831 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" path="/var/lib/kubelet/pods/c7ca2a9f-a42e-4d9b-89a7-f2590842f328/volumes" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.326656 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db34a476-dd22-4085-bb2c-a8e57b0d9889" path="/var/lib/kubelet/pods/db34a476-dd22-4085-bb2c-a8e57b0d9889/volumes" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.599715 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.678829 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"37519387-1738-4500-9953-52deba3e4a85","Type":"ContainerStarted","Data":"7b2bc74dad8d8a36748bc47857d2093994bda1653fc9f0dd1fdc558a4806b28f"} Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.682166 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"cf4c3569-6860-4c2a-8923-42e436279a11","Type":"ContainerStarted","Data":"15f5de16a32527cfaf74ad662d3f3049ea76aad5457fd4357ff0db16b4599bf4"} Feb 27 00:31:35 crc kubenswrapper[4781]: W0227 00:31:35.706804 4781 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7ad9523_5281_4d1c_a9d5_92982905d525.slice/crio-0c79a857e65c1b2a45875ccb98d3fdf7c4e5a3bbcc5c33cf6ee46d5709600a11 WatchSource:0}: Error finding container 0c79a857e65c1b2a45875ccb98d3fdf7c4e5a3bbcc5c33cf6ee46d5709600a11: Status 404 returned error can't find the container with id 0c79a857e65c1b2a45875ccb98d3fdf7c4e5a3bbcc5c33cf6ee46d5709600a11 Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.869239 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-982rb"] Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.699151 4781 generic.go:334] "Generic (PLEG): container finished" podID="f81105ac-48e2-4b90-820a-8d7758ad3b33" containerID="f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583" exitCode=0 Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.699262 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" event={"ID":"f81105ac-48e2-4b90-820a-8d7758ad3b33","Type":"ContainerDied","Data":"f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583"} Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.699509 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" event={"ID":"f81105ac-48e2-4b90-820a-8d7758ad3b33","Type":"ContainerStarted","Data":"68785af84dc6132ad668c9748f55cbc0790b34d7e605682887df4cda02988cd2"} Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.717119 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ed38e2f2-b350-4abd-abe2-859c9d504aa8","Type":"ContainerStarted","Data":"bfaee7ec7de3505b4e22cf4499593dc512858eb8d4ea24469079b8a31c14c355"} Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.726050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" 
event={"ID":"a7ad9523-5281-4d1c-a9d5-92982905d525","Type":"ContainerStarted","Data":"4db093f6d8dc270dfe390ce7dc919fc13daaac34b82c5bce418d56fe704e73aa"} Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.726101 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"a7ad9523-5281-4d1c-a9d5-92982905d525","Type":"ContainerStarted","Data":"d2adb80aec2419491004b0d99da92c28ca8237830402ae09e31cc85b3c9be10b"} Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.726117 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"a7ad9523-5281-4d1c-a9d5-92982905d525","Type":"ContainerStarted","Data":"0c79a857e65c1b2a45875ccb98d3fdf7c4e5a3bbcc5c33cf6ee46d5709600a11"} Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.727211 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.735273 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"cf4c3569-6860-4c2a-8923-42e436279a11","Type":"ContainerStarted","Data":"5c56aa725d72412984d893e2e1e07b2db369fa8fbd72a3dd7aab740b2a509825"} Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.796375 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.796355771 podStartE2EDuration="2.796355771s" podCreationTimestamp="2026-02-27 00:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:31:36.777273686 +0000 UTC m=+1566.034813260" watchObservedRunningTime="2026-02-27 00:31:36.796355771 +0000 UTC m=+1566.053895325" Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.808966 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=3.558241368 
podStartE2EDuration="3.808949401s" podCreationTimestamp="2026-02-27 00:31:33 +0000 UTC" firstStartedPulling="2026-02-27 00:31:35.066097657 +0000 UTC m=+1564.323637211" lastFinishedPulling="2026-02-27 00:31:35.31680569 +0000 UTC m=+1564.574345244" observedRunningTime="2026-02-27 00:31:36.794157462 +0000 UTC m=+1566.051697016" watchObservedRunningTime="2026-02-27 00:31:36.808949401 +0000 UTC m=+1566.066488955" Feb 27 00:31:37 crc kubenswrapper[4781]: I0227 00:31:37.745894 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"37519387-1738-4500-9953-52deba3e4a85","Type":"ContainerStarted","Data":"35670c776bc05f00a77eed47488c259dc1a1c6ce2969f6d1ea6d21ba78546cf9"} Feb 27 00:31:37 crc kubenswrapper[4781]: I0227 00:31:37.752556 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" event={"ID":"f81105ac-48e2-4b90-820a-8d7758ad3b33","Type":"ContainerStarted","Data":"f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21"} Feb 27 00:31:37 crc kubenswrapper[4781]: I0227 00:31:37.752596 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:37 crc kubenswrapper[4781]: I0227 00:31:37.821968 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" podStartSLOduration=3.821947659 podStartE2EDuration="3.821947659s" podCreationTimestamp="2026-02-27 00:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:31:37.806403939 +0000 UTC m=+1567.063943503" watchObservedRunningTime="2026-02-27 00:31:37.821947659 +0000 UTC m=+1567.079487213" Feb 27 00:31:40 crc kubenswrapper[4781]: E0227 00:31:40.320647 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice/crio-c1cf39a6e1aaa1fdbc695699fd6efe141913102901ca2317e8b825da4d37a1de\": RecentStats: unable to find data in memory cache]" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.205276 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.278788 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-hm24r"] Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.279057 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" podUID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" containerName="dnsmasq-dns" containerID="cri-o://e59534a993c981832971b7ef17c8f1e9f9d24b23112b98369b8ccf0ba58923af" gracePeriod=10 Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.457950 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-9drr8"] Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.459786 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.473704 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-9drr8"] Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.499216 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.499280 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.499318 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-dns-svc\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.499347 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-config\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.499376 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-gq4fg\" (UniqueName: \"kubernetes.io/projected/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-kube-api-access-gq4fg\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.499400 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.499448 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.600077 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.600134 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-dns-svc\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.600164 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-config\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.600194 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq4fg\" (UniqueName: \"kubernetes.io/projected/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-kube-api-access-gq4fg\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.600218 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.600268 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.600337 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.600976 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-dns-svc\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.601144 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.601226 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.601511 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.601517 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-config\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.602416 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.624405 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq4fg\" (UniqueName: \"kubernetes.io/projected/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-kube-api-access-gq4fg\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.785353 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.842759 4781 generic.go:334] "Generic (PLEG): container finished" podID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" containerID="e59534a993c981832971b7ef17c8f1e9f9d24b23112b98369b8ccf0ba58923af" exitCode=0 Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.842814 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" event={"ID":"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e","Type":"ContainerDied","Data":"e59534a993c981832971b7ef17c8f1e9f9d24b23112b98369b8ccf0ba58923af"} Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.962716 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.115405 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-nb\") pod \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.115478 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-sb\") pod \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.115555 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwcnr\" (UniqueName: \"kubernetes.io/projected/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-kube-api-access-cwcnr\") pod \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.115617 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-config\") pod \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.115687 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-swift-storage-0\") pod \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.115832 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-svc\") pod \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.132329 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-kube-api-access-cwcnr" (OuterVolumeSpecName: "kube-api-access-cwcnr") pod "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" (UID: "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e"). InnerVolumeSpecName "kube-api-access-cwcnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.203558 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-config" (OuterVolumeSpecName: "config") pod "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" (UID: "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.218851 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" (UID: "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.223213 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwcnr\" (UniqueName: \"kubernetes.io/projected/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-kube-api-access-cwcnr\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.223244 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.223257 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.236198 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" (UID: "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.264006 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" (UID: "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.275178 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" (UID: "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.324909 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.324944 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.324955 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.330012 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-9drr8"] Feb 27 00:31:46 crc kubenswrapper[4781]: W0227 00:31:46.330245 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01f0f26_7e7a_464f_8f50_4d49bf87cb46.slice/crio-c9e9d5f5376699e834cdeb5a1d46d883ebb4c9dad98d3930976901abe0fd539e WatchSource:0}: Error finding container c9e9d5f5376699e834cdeb5a1d46d883ebb4c9dad98d3930976901abe0fd539e: Status 404 returned error can't find the container with id 
c9e9d5f5376699e834cdeb5a1d46d883ebb4c9dad98d3930976901abe0fd539e Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.854167 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" event={"ID":"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e","Type":"ContainerDied","Data":"bc2ce3ec147ae0ea063d3bc5998697394125f3fd2381d07ae16cdf6df5227b71"} Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.854195 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.854546 4781 scope.go:117] "RemoveContainer" containerID="e59534a993c981832971b7ef17c8f1e9f9d24b23112b98369b8ccf0ba58923af" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.856116 4781 generic.go:334] "Generic (PLEG): container finished" podID="f01f0f26-7e7a-464f-8f50-4d49bf87cb46" containerID="222056db10424527f55219b5eb2c847209139d1f76c50f383e85073a4ddf04ff" exitCode=0 Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.856154 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-9drr8" event={"ID":"f01f0f26-7e7a-464f-8f50-4d49bf87cb46","Type":"ContainerDied","Data":"222056db10424527f55219b5eb2c847209139d1f76c50f383e85073a4ddf04ff"} Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.856176 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-9drr8" event={"ID":"f01f0f26-7e7a-464f-8f50-4d49bf87cb46","Type":"ContainerStarted","Data":"c9e9d5f5376699e834cdeb5a1d46d883ebb4c9dad98d3930976901abe0fd539e"} Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.887724 4781 scope.go:117] "RemoveContainer" containerID="0477def692642480b7baa681e79da18341ef273274b3570944d4f51dd3971947" Feb 27 00:31:47 crc kubenswrapper[4781]: I0227 00:31:47.069134 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-hm24r"] Feb 27 00:31:47 crc 
kubenswrapper[4781]: I0227 00:31:47.082254 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-hm24r"] Feb 27 00:31:47 crc kubenswrapper[4781]: I0227 00:31:47.324853 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" path="/var/lib/kubelet/pods/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e/volumes" Feb 27 00:31:47 crc kubenswrapper[4781]: I0227 00:31:47.867855 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-9drr8" event={"ID":"f01f0f26-7e7a-464f-8f50-4d49bf87cb46","Type":"ContainerStarted","Data":"c9ec13fed5c820c01f8981c2700184d9e0874c5b37cbe966d1fb30046ce9b5df"} Feb 27 00:31:47 crc kubenswrapper[4781]: I0227 00:31:47.868479 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:47 crc kubenswrapper[4781]: I0227 00:31:47.887385 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85f64749dc-9drr8" podStartSLOduration=2.887367998 podStartE2EDuration="2.887367998s" podCreationTimestamp="2026-02-27 00:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:31:47.883338989 +0000 UTC m=+1577.140878543" watchObservedRunningTime="2026-02-27 00:31:47.887367998 +0000 UTC m=+1577.144907552" Feb 27 00:31:50 crc kubenswrapper[4781]: E0227 00:31:50.627043 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice/crio-c1cf39a6e1aaa1fdbc695699fd6efe141913102901ca2317e8b825da4d37a1de\": RecentStats: unable 
to find data in memory cache]" Feb 27 00:31:51 crc kubenswrapper[4781]: I0227 00:31:51.620865 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 27 00:31:55 crc kubenswrapper[4781]: I0227 00:31:55.787815 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:55 crc kubenswrapper[4781]: I0227 00:31:55.858151 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-982rb"] Feb 27 00:31:55 crc kubenswrapper[4781]: I0227 00:31:55.858922 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" podUID="f81105ac-48e2-4b90-820a-8d7758ad3b33" containerName="dnsmasq-dns" containerID="cri-o://f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21" gracePeriod=10 Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.518990 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.649307 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-svc\") pod \"f81105ac-48e2-4b90-820a-8d7758ad3b33\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.649480 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-openstack-edpm-ipam\") pod \"f81105ac-48e2-4b90-820a-8d7758ad3b33\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.649525 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-sb\") pod \"f81105ac-48e2-4b90-820a-8d7758ad3b33\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.649582 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-config\") pod \"f81105ac-48e2-4b90-820a-8d7758ad3b33\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.649617 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-nb\") pod \"f81105ac-48e2-4b90-820a-8d7758ad3b33\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.649730 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sdc6\" 
(UniqueName: \"kubernetes.io/projected/f81105ac-48e2-4b90-820a-8d7758ad3b33-kube-api-access-9sdc6\") pod \"f81105ac-48e2-4b90-820a-8d7758ad3b33\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.649838 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-swift-storage-0\") pod \"f81105ac-48e2-4b90-820a-8d7758ad3b33\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.664909 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f81105ac-48e2-4b90-820a-8d7758ad3b33-kube-api-access-9sdc6" (OuterVolumeSpecName: "kube-api-access-9sdc6") pod "f81105ac-48e2-4b90-820a-8d7758ad3b33" (UID: "f81105ac-48e2-4b90-820a-8d7758ad3b33"). InnerVolumeSpecName "kube-api-access-9sdc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.727118 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f81105ac-48e2-4b90-820a-8d7758ad3b33" (UID: "f81105ac-48e2-4b90-820a-8d7758ad3b33"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.733929 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f81105ac-48e2-4b90-820a-8d7758ad3b33" (UID: "f81105ac-48e2-4b90-820a-8d7758ad3b33"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.738348 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "f81105ac-48e2-4b90-820a-8d7758ad3b33" (UID: "f81105ac-48e2-4b90-820a-8d7758ad3b33"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.753022 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f81105ac-48e2-4b90-820a-8d7758ad3b33" (UID: "f81105ac-48e2-4b90-820a-8d7758ad3b33"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.755086 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.755125 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.755137 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.755146 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.755156 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sdc6\" (UniqueName: \"kubernetes.io/projected/f81105ac-48e2-4b90-820a-8d7758ad3b33-kube-api-access-9sdc6\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.767290 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f81105ac-48e2-4b90-820a-8d7758ad3b33" (UID: "f81105ac-48e2-4b90-820a-8d7758ad3b33"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.767595 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-config" (OuterVolumeSpecName: "config") pod "f81105ac-48e2-4b90-820a-8d7758ad3b33" (UID: "f81105ac-48e2-4b90-820a-8d7758ad3b33"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.857366 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.857407 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.979757 4781 generic.go:334] "Generic (PLEG): container finished" podID="f81105ac-48e2-4b90-820a-8d7758ad3b33" containerID="f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21" exitCode=0 Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.979824 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" event={"ID":"f81105ac-48e2-4b90-820a-8d7758ad3b33","Type":"ContainerDied","Data":"f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21"} Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.979849 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" event={"ID":"f81105ac-48e2-4b90-820a-8d7758ad3b33","Type":"ContainerDied","Data":"68785af84dc6132ad668c9748f55cbc0790b34d7e605682887df4cda02988cd2"} Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.979864 4781 scope.go:117] "RemoveContainer" containerID="f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.979999 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:57 crc kubenswrapper[4781]: I0227 00:31:57.040605 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-982rb"] Feb 27 00:31:57 crc kubenswrapper[4781]: I0227 00:31:57.040940 4781 scope.go:117] "RemoveContainer" containerID="f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583" Feb 27 00:31:57 crc kubenswrapper[4781]: I0227 00:31:57.052202 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-982rb"] Feb 27 00:31:57 crc kubenswrapper[4781]: I0227 00:31:57.075554 4781 scope.go:117] "RemoveContainer" containerID="f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21" Feb 27 00:31:57 crc kubenswrapper[4781]: E0227 00:31:57.076034 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21\": container with ID starting with f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21 not found: ID does not exist" containerID="f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21" Feb 27 00:31:57 crc kubenswrapper[4781]: I0227 00:31:57.076099 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21"} err="failed to get container status \"f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21\": rpc error: code = NotFound desc = could not find container \"f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21\": container with ID starting with f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21 not found: ID does not exist" Feb 27 00:31:57 crc kubenswrapper[4781]: I0227 00:31:57.076125 4781 scope.go:117] "RemoveContainer" containerID="f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583" Feb 27 
00:31:57 crc kubenswrapper[4781]: E0227 00:31:57.076453 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583\": container with ID starting with f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583 not found: ID does not exist" containerID="f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583" Feb 27 00:31:57 crc kubenswrapper[4781]: I0227 00:31:57.076485 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583"} err="failed to get container status \"f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583\": rpc error: code = NotFound desc = could not find container \"f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583\": container with ID starting with f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583 not found: ID does not exist" Feb 27 00:31:57 crc kubenswrapper[4781]: I0227 00:31:57.320145 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f81105ac-48e2-4b90-820a-8d7758ad3b33" path="/var/lib/kubelet/pods/f81105ac-48e2-4b90-820a-8d7758ad3b33/volumes" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.153934 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535872-fpvhr"] Feb 27 00:32:00 crc kubenswrapper[4781]: E0227 00:32:00.154898 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81105ac-48e2-4b90-820a-8d7758ad3b33" containerName="init" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.154914 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81105ac-48e2-4b90-820a-8d7758ad3b33" containerName="init" Feb 27 00:32:00 crc kubenswrapper[4781]: E0227 00:32:00.154947 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f81105ac-48e2-4b90-820a-8d7758ad3b33" containerName="dnsmasq-dns" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.154953 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81105ac-48e2-4b90-820a-8d7758ad3b33" containerName="dnsmasq-dns" Feb 27 00:32:00 crc kubenswrapper[4781]: E0227 00:32:00.154971 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" containerName="init" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.154978 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" containerName="init" Feb 27 00:32:00 crc kubenswrapper[4781]: E0227 00:32:00.154987 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" containerName="dnsmasq-dns" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.154993 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" containerName="dnsmasq-dns" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.155169 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" containerName="dnsmasq-dns" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.155180 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f81105ac-48e2-4b90-820a-8d7758ad3b33" containerName="dnsmasq-dns" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.155977 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535872-fpvhr" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.158355 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.158564 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.164346 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.177282 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535872-fpvhr"] Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.227753 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvkp2\" (UniqueName: \"kubernetes.io/projected/28ad6440-a4bb-43a6-985a-42979a799437-kube-api-access-cvkp2\") pod \"auto-csr-approver-29535872-fpvhr\" (UID: \"28ad6440-a4bb-43a6-985a-42979a799437\") " pod="openshift-infra/auto-csr-approver-29535872-fpvhr" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.330165 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvkp2\" (UniqueName: \"kubernetes.io/projected/28ad6440-a4bb-43a6-985a-42979a799437-kube-api-access-cvkp2\") pod \"auto-csr-approver-29535872-fpvhr\" (UID: \"28ad6440-a4bb-43a6-985a-42979a799437\") " pod="openshift-infra/auto-csr-approver-29535872-fpvhr" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.347813 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvkp2\" (UniqueName: \"kubernetes.io/projected/28ad6440-a4bb-43a6-985a-42979a799437-kube-api-access-cvkp2\") pod \"auto-csr-approver-29535872-fpvhr\" (UID: \"28ad6440-a4bb-43a6-985a-42979a799437\") " 
pod="openshift-infra/auto-csr-approver-29535872-fpvhr" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.490181 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535872-fpvhr" Feb 27 00:32:00 crc kubenswrapper[4781]: E0227 00:32:00.920705 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice/crio-c1cf39a6e1aaa1fdbc695699fd6efe141913102901ca2317e8b825da4d37a1de\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice\": RecentStats: unable to find data in memory cache]" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.971271 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535872-fpvhr"] Feb 27 00:32:01 crc kubenswrapper[4781]: I0227 00:32:01.025242 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535872-fpvhr" event={"ID":"28ad6440-a4bb-43a6-985a-42979a799437","Type":"ContainerStarted","Data":"9e91a53fa0d7b46fa4d6db2c1af114047a98ba8b5905295dec631c90cc238eb5"} Feb 27 00:32:03 crc kubenswrapper[4781]: I0227 00:32:03.047350 4781 generic.go:334] "Generic (PLEG): container finished" podID="28ad6440-a4bb-43a6-985a-42979a799437" containerID="90d3da646bb32391ad6c504fecd5db68f89221b28accf451c40b52dc228b7d89" exitCode=0 Feb 27 00:32:03 crc kubenswrapper[4781]: I0227 00:32:03.047418 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535872-fpvhr" event={"ID":"28ad6440-a4bb-43a6-985a-42979a799437","Type":"ContainerDied","Data":"90d3da646bb32391ad6c504fecd5db68f89221b28accf451c40b52dc228b7d89"} Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.529604 4781 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"] Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.531177 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.534361 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.536202 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.536424 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.538400 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.548647 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"] Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.583048 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535872-fpvhr" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.621898 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.622018 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.622041 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.622074 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjdrb\" (UniqueName: \"kubernetes.io/projected/05795337-1929-47d6-b63f-96d078b66c47-kube-api-access-gjdrb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: 
I0227 00:32:04.723531 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvkp2\" (UniqueName: \"kubernetes.io/projected/28ad6440-a4bb-43a6-985a-42979a799437-kube-api-access-cvkp2\") pod \"28ad6440-a4bb-43a6-985a-42979a799437\" (UID: \"28ad6440-a4bb-43a6-985a-42979a799437\") "
Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.723859 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjdrb\" (UniqueName: \"kubernetes.io/projected/05795337-1929-47d6-b63f-96d078b66c47-kube-api-access-gjdrb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"
Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.723983 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"
Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.724066 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"
Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.724089 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"
Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.729649 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"
Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.730304 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"
Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.735204 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"
Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.744906 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ad6440-a4bb-43a6-985a-42979a799437-kube-api-access-cvkp2" (OuterVolumeSpecName: "kube-api-access-cvkp2") pod "28ad6440-a4bb-43a6-985a-42979a799437" (UID: "28ad6440-a4bb-43a6-985a-42979a799437"). InnerVolumeSpecName "kube-api-access-cvkp2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.746807 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjdrb\" (UniqueName: \"kubernetes.io/projected/05795337-1929-47d6-b63f-96d078b66c47-kube-api-access-gjdrb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"
Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.826732 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvkp2\" (UniqueName: \"kubernetes.io/projected/28ad6440-a4bb-43a6-985a-42979a799437-kube-api-access-cvkp2\") on node \"crc\" DevicePath \"\""
Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.890824 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"
Feb 27 00:32:05 crc kubenswrapper[4781]: I0227 00:32:05.079592 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535872-fpvhr" event={"ID":"28ad6440-a4bb-43a6-985a-42979a799437","Type":"ContainerDied","Data":"9e91a53fa0d7b46fa4d6db2c1af114047a98ba8b5905295dec631c90cc238eb5"}
Feb 27 00:32:05 crc kubenswrapper[4781]: I0227 00:32:05.079663 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e91a53fa0d7b46fa4d6db2c1af114047a98ba8b5905295dec631c90cc238eb5"
Feb 27 00:32:05 crc kubenswrapper[4781]: I0227 00:32:05.079730 4781 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535872-fpvhr"
Feb 27 00:32:05 crc kubenswrapper[4781]: I0227 00:32:05.562707 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"]
Feb 27 00:32:05 crc kubenswrapper[4781]: W0227 00:32:05.565106 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05795337_1929_47d6_b63f_96d078b66c47.slice/crio-758121309d739274e74f53a22e843b3e32bd2d388e94666e571e5c3e7026bff2 WatchSource:0}: Error finding container 758121309d739274e74f53a22e843b3e32bd2d388e94666e571e5c3e7026bff2: Status 404 returned error can't find the container with id 758121309d739274e74f53a22e843b3e32bd2d388e94666e571e5c3e7026bff2
Feb 27 00:32:05 crc kubenswrapper[4781]: I0227 00:32:05.707688 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535866-qpv8l"]
Feb 27 00:32:05 crc kubenswrapper[4781]: I0227 00:32:05.721752 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535866-qpv8l"]
Feb 27 00:32:06 crc kubenswrapper[4781]: I0227 00:32:06.090523 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" event={"ID":"05795337-1929-47d6-b63f-96d078b66c47","Type":"ContainerStarted","Data":"758121309d739274e74f53a22e843b3e32bd2d388e94666e571e5c3e7026bff2"}
Feb 27 00:32:07 crc kubenswrapper[4781]: I0227 00:32:07.329995 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe4edac-acb6-4906-9b3b-42b7c7a98943" path="/var/lib/kubelet/pods/1fe4edac-acb6-4906-9b3b-42b7c7a98943/volumes"
Feb 27 00:32:08 crc kubenswrapper[4781]: I0227 00:32:08.124703 4781 generic.go:334] "Generic (PLEG): container finished" podID="ed38e2f2-b350-4abd-abe2-859c9d504aa8" containerID="bfaee7ec7de3505b4e22cf4499593dc512858eb8d4ea24469079b8a31c14c355" exitCode=0
Feb 27 00:32:08 crc kubenswrapper[4781]: I0227 00:32:08.124785 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ed38e2f2-b350-4abd-abe2-859c9d504aa8","Type":"ContainerDied","Data":"bfaee7ec7de3505b4e22cf4499593dc512858eb8d4ea24469079b8a31c14c355"}
Feb 27 00:32:09 crc kubenswrapper[4781]: I0227 00:32:09.147455 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ed38e2f2-b350-4abd-abe2-859c9d504aa8","Type":"ContainerStarted","Data":"02dd84684c9a248bca28815a43bafa3423f6e8c22db55a548880cac1191bbca2"}
Feb 27 00:32:09 crc kubenswrapper[4781]: I0227 00:32:09.148212 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 27 00:32:09 crc kubenswrapper[4781]: I0227 00:32:09.171285 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.171269678 podStartE2EDuration="36.171269678s" podCreationTimestamp="2026-02-27 00:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:32:09.170298881 +0000 UTC m=+1598.427838435" watchObservedRunningTime="2026-02-27 00:32:09.171269678 +0000 UTC m=+1598.428809232"
Feb 27 00:32:10 crc kubenswrapper[4781]: I0227 00:32:10.160617 4781 generic.go:334] "Generic (PLEG): container finished" podID="37519387-1738-4500-9953-52deba3e4a85" containerID="35670c776bc05f00a77eed47488c259dc1a1c6ce2969f6d1ea6d21ba78546cf9" exitCode=0
Feb 27 00:32:10 crc kubenswrapper[4781]: I0227 00:32:10.160686 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"37519387-1738-4500-9953-52deba3e4a85","Type":"ContainerDied","Data":"35670c776bc05f00a77eed47488c259dc1a1c6ce2969f6d1ea6d21ba78546cf9"}
Feb 27 00:32:11 crc kubenswrapper[4781]: E0227 00:32:11.223773 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice/crio-c1cf39a6e1aaa1fdbc695699fd6efe141913102901ca2317e8b825da4d37a1de\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice\": RecentStats: unable to find data in memory cache]"
Feb 27 00:32:12 crc kubenswrapper[4781]: I0227 00:32:12.110244 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0"
Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.055860 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d8cst"]
Feb 27 00:32:15 crc kubenswrapper[4781]: E0227 00:32:15.057242 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ad6440-a4bb-43a6-985a-42979a799437" containerName="oc"
Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.057266 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ad6440-a4bb-43a6-985a-42979a799437" containerName="oc"
Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.057678 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="28ad6440-a4bb-43a6-985a-42979a799437" containerName="oc"
Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.060558 4781 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.066582 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d8cst"]
Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.244433 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-utilities\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.244656 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-catalog-content\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.244698 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkhnm\" (UniqueName: \"kubernetes.io/projected/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-kube-api-access-hkhnm\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.346605 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-catalog-content\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.347176 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkhnm\" (UniqueName: \"kubernetes.io/projected/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-kube-api-access-hkhnm\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.347258 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-utilities\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.348306 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-utilities\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.348664 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-catalog-content\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.367107 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkhnm\" (UniqueName: \"kubernetes.io/projected/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-kube-api-access-hkhnm\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.389508 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:15 crc kubenswrapper[4781]: W0227 00:32:15.983832 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48b5a0c6_df06_4a8e_8f22_e17d79c0dcb2.slice/crio-ef28bee2e1ea85e9824d7aa816578f65afc6d5968cc3a2d777e02d92ca74b755 WatchSource:0}: Error finding container ef28bee2e1ea85e9824d7aa816578f65afc6d5968cc3a2d777e02d92ca74b755: Status 404 returned error can't find the container with id ef28bee2e1ea85e9824d7aa816578f65afc6d5968cc3a2d777e02d92ca74b755
Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.987864 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d8cst"]
Feb 27 00:32:16 crc kubenswrapper[4781]: I0227 00:32:16.231134 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"37519387-1738-4500-9953-52deba3e4a85","Type":"ContainerStarted","Data":"0048ebb6ce6c868afaed9d5bc7916d5f81a79b806ca8eaad5b59b8285b42b235"}
Feb 27 00:32:16 crc kubenswrapper[4781]: I0227 00:32:16.231358 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:32:16 crc kubenswrapper[4781]: I0227 00:32:16.235052 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" event={"ID":"05795337-1929-47d6-b63f-96d078b66c47","Type":"ContainerStarted","Data":"f0acd75c80c39bafdd1bd55a70eff436e62d8c05f625f9a638a4cea0a03b81f1"}
Feb 27 00:32:16 crc kubenswrapper[4781]: I0227 00:32:16.236512 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8cst" event={"ID":"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2","Type":"ContainerStarted","Data":"ef28bee2e1ea85e9824d7aa816578f65afc6d5968cc3a2d777e02d92ca74b755"}
Feb 27 00:32:16 crc kubenswrapper[4781]: I0227 00:32:16.281692
4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.281672083 podStartE2EDuration="43.281672083s" podCreationTimestamp="2026-02-27 00:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:32:16.256937785 +0000 UTC m=+1605.514477339" watchObservedRunningTime="2026-02-27 00:32:16.281672083 +0000 UTC m=+1605.539211647"
Feb 27 00:32:16 crc kubenswrapper[4781]: I0227 00:32:16.286042 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" podStartSLOduration=2.499810275 podStartE2EDuration="12.286033691s" podCreationTimestamp="2026-02-27 00:32:04 +0000 UTC" firstStartedPulling="2026-02-27 00:32:05.567188459 +0000 UTC m=+1594.824728003" lastFinishedPulling="2026-02-27 00:32:15.353411865 +0000 UTC m=+1604.610951419" observedRunningTime="2026-02-27 00:32:16.27820872 +0000 UTC m=+1605.535748274" watchObservedRunningTime="2026-02-27 00:32:16.286033691 +0000 UTC m=+1605.543573245"
Feb 27 00:32:17 crc kubenswrapper[4781]: I0227 00:32:17.247396 4781 generic.go:334] "Generic (PLEG): container finished" podID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerID="0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9" exitCode=0
Feb 27 00:32:17 crc kubenswrapper[4781]: I0227 00:32:17.247513 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8cst" event={"ID":"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2","Type":"ContainerDied","Data":"0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9"}
Feb 27 00:32:19 crc kubenswrapper[4781]: I0227 00:32:19.272200 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8cst" event={"ID":"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2","Type":"ContainerStarted","Data":"48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49"}
Feb 27 00:32:21 crc kubenswrapper[4781]: I0227 00:32:21.302153 4781 generic.go:334] "Generic (PLEG): container finished" podID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerID="48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49" exitCode=0
Feb 27 00:32:21 crc kubenswrapper[4781]: I0227 00:32:21.302417 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8cst" event={"ID":"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2","Type":"ContainerDied","Data":"48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49"}
Feb 27 00:32:21 crc kubenswrapper[4781]: E0227 00:32:21.517260 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice/crio-c1cf39a6e1aaa1fdbc695699fd6efe141913102901ca2317e8b825da4d37a1de\": RecentStats: unable to find data in memory cache]"
Feb 27 00:32:22 crc kubenswrapper[4781]: I0227 00:32:22.322121 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8cst" event={"ID":"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2","Type":"ContainerStarted","Data":"e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f"}
Feb 27 00:32:22 crc kubenswrapper[4781]: I0227 00:32:22.343639 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d8cst" podStartSLOduration=2.904209517 podStartE2EDuration="7.343604513s" podCreationTimestamp="2026-02-27 00:32:15 +0000 UTC" firstStartedPulling="2026-02-27 00:32:17.249011617 +0000 UTC m=+1606.506551171" lastFinishedPulling="2026-02-27 00:32:21.688406603 +0000 UTC m=+1610.945946167" observedRunningTime="2026-02-27 00:32:22.341495326 +0000 UTC m=+1611.599034890" watchObservedRunningTime="2026-02-27 00:32:22.343604513 +0000 UTC m=+1611.601144077"
Feb 27 00:32:23 crc kubenswrapper[4781]: I0227 00:32:23.553100 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 27 00:32:25 crc kubenswrapper[4781]: I0227 00:32:25.389736 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:25 crc kubenswrapper[4781]: I0227 00:32:25.390309 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:25 crc kubenswrapper[4781]: I0227 00:32:25.442789 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:26 crc kubenswrapper[4781]: I0227 00:32:26.460752 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:26 crc kubenswrapper[4781]: I0227 00:32:26.510265 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d8cst"]
Feb 27 00:32:26 crc kubenswrapper[4781]: I0227 00:32:26.975434 4781 scope.go:117] "RemoveContainer" containerID="e2bf980506549d387ee967a300bd50ff50a9e4489a44bcc2c952a5e2c00137a5"
Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.020246 4781 scope.go:117] "RemoveContainer" containerID="3d01f4c64b31dda5359f791eed0af9accdc107437765895fcc3cd585df0f55ae"
Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.047707 4781 scope.go:117] "RemoveContainer" containerID="c6e860c6c62b63e5a5fe835a4877c45040a36e7fc332cce5af395a3eaa5e24b1"
Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.115380 4781 scope.go:117]
"RemoveContainer" containerID="a458867b742ce8b5b3fdd2c97ebf1845a6845fd00e046dd893821ec44de7237b"
Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.145181 4781 scope.go:117] "RemoveContainer" containerID="beeaff089c6577afca77da55c908132f8c47a3993cf1d2011eea873db182b172"
Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.198076 4781 scope.go:117] "RemoveContainer" containerID="31cd21a634eff04c79df7b5ee8d37fc4cdb1a4b5a72c57fc0d9aca1961c28780"
Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.240359 4781 scope.go:117] "RemoveContainer" containerID="0d295c8666e863d2c0e4e0d3a3e33356c58f61c54e944f8ced4d911133124bc0"
Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.294248 4781 scope.go:117] "RemoveContainer" containerID="490f54d4fc0654da6b5add2d9e470584271088a4fc9d0ff0972339bc97ab6f8f"
Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.330220 4781 scope.go:117] "RemoveContainer" containerID="28555d58f1fd114e239212917d6df64a83d89ed63bf1f65157974daf4ae101b8"
Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.381991 4781 generic.go:334] "Generic (PLEG): container finished" podID="05795337-1929-47d6-b63f-96d078b66c47" containerID="f0acd75c80c39bafdd1bd55a70eff436e62d8c05f625f9a638a4cea0a03b81f1" exitCode=0
Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.382051 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" event={"ID":"05795337-1929-47d6-b63f-96d078b66c47","Type":"ContainerDied","Data":"f0acd75c80c39bafdd1bd55a70eff436e62d8c05f625f9a638a4cea0a03b81f1"}
Feb 27 00:32:28 crc kubenswrapper[4781]: I0227 00:32:28.401146 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d8cst" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerName="registry-server" containerID="cri-o://e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f" gracePeriod=2
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.054957 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.063967 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.181109 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-catalog-content\") pod \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") "
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.181159 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjdrb\" (UniqueName: \"kubernetes.io/projected/05795337-1929-47d6-b63f-96d078b66c47-kube-api-access-gjdrb\") pod \"05795337-1929-47d6-b63f-96d078b66c47\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") "
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.181237 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkhnm\" (UniqueName: \"kubernetes.io/projected/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-kube-api-access-hkhnm\") pod \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") "
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.181261 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-ssh-key-openstack-edpm-ipam\") pod \"05795337-1929-47d6-b63f-96d078b66c47\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") "
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.181313 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-inventory\") pod \"05795337-1929-47d6-b63f-96d078b66c47\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") "
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.181417 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-repo-setup-combined-ca-bundle\") pod \"05795337-1929-47d6-b63f-96d078b66c47\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") "
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.181550 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-utilities\") pod \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") "
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.182330 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-utilities" (OuterVolumeSpecName: "utilities") pod "48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" (UID: "48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.190517 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-kube-api-access-hkhnm" (OuterVolumeSpecName: "kube-api-access-hkhnm") pod "48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" (UID: "48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2"). InnerVolumeSpecName "kube-api-access-hkhnm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.190894 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05795337-1929-47d6-b63f-96d078b66c47-kube-api-access-gjdrb" (OuterVolumeSpecName: "kube-api-access-gjdrb") pod "05795337-1929-47d6-b63f-96d078b66c47" (UID: "05795337-1929-47d6-b63f-96d078b66c47"). InnerVolumeSpecName "kube-api-access-gjdrb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.194807 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "05795337-1929-47d6-b63f-96d078b66c47" (UID: "05795337-1929-47d6-b63f-96d078b66c47"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.217352 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-inventory" (OuterVolumeSpecName: "inventory") pod "05795337-1929-47d6-b63f-96d078b66c47" (UID: "05795337-1929-47d6-b63f-96d078b66c47"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.237823 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "05795337-1929-47d6-b63f-96d078b66c47" (UID: "05795337-1929-47d6-b63f-96d078b66c47"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.239730 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" (UID: "48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.283900 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkhnm\" (UniqueName: \"kubernetes.io/projected/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-kube-api-access-hkhnm\") on node \"crc\" DevicePath \"\""
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.283948 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.283969 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-inventory\") on node \"crc\" DevicePath \"\""
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.283990 4781 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.284008 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.284026 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.284046 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjdrb\" (UniqueName: \"kubernetes.io/projected/05795337-1929-47d6-b63f-96d078b66c47-kube-api-access-gjdrb\") on node \"crc\" DevicePath \"\""
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.414845 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" event={"ID":"05795337-1929-47d6-b63f-96d078b66c47","Type":"ContainerDied","Data":"758121309d739274e74f53a22e843b3e32bd2d388e94666e571e5c3e7026bff2"}
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.414894 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="758121309d739274e74f53a22e843b3e32bd2d388e94666e571e5c3e7026bff2"
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.414946 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.419970 4781 generic.go:334] "Generic (PLEG): container finished" podID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerID="e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f" exitCode=0
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.420022 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d8cst"
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.420030 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8cst" event={"ID":"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2","Type":"ContainerDied","Data":"e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f"}
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.420088 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8cst" event={"ID":"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2","Type":"ContainerDied","Data":"ef28bee2e1ea85e9824d7aa816578f65afc6d5968cc3a2d777e02d92ca74b755"}
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.420113 4781 scope.go:117] "RemoveContainer" containerID="e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f"
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.445923 4781 scope.go:117] "RemoveContainer" containerID="48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49"
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.468536 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d8cst"]
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.483592 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d8cst"]
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.484272 4781 scope.go:117] "RemoveContainer" containerID="0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9"
Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.497086 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4"]
Feb 27 00:32:29 crc kubenswrapper[4781]: E0227 00:32:29.497810 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerName="extract-utilities"
Feb 27
00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.497830 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerName="extract-utilities" Feb 27 00:32:29 crc kubenswrapper[4781]: E0227 00:32:29.497857 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05795337-1929-47d6-b63f-96d078b66c47" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.497864 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="05795337-1929-47d6-b63f-96d078b66c47" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 27 00:32:29 crc kubenswrapper[4781]: E0227 00:32:29.497874 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerName="registry-server" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.497882 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerName="registry-server" Feb 27 00:32:29 crc kubenswrapper[4781]: E0227 00:32:29.497903 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerName="extract-content" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.497909 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerName="extract-content" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.498108 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerName="registry-server" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.498125 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="05795337-1929-47d6-b63f-96d078b66c47" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.498890 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.501688 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.501979 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.503388 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.509822 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4"] Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.510226 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.517440 4781 scope.go:117] "RemoveContainer" containerID="e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f" Feb 27 00:32:29 crc kubenswrapper[4781]: E0227 00:32:29.519598 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f\": container with ID starting with e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f not found: ID does not exist" containerID="e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.519758 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f"} err="failed to get container status \"e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f\": rpc error: code = 
NotFound desc = could not find container \"e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f\": container with ID starting with e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f not found: ID does not exist" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.519792 4781 scope.go:117] "RemoveContainer" containerID="48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49" Feb 27 00:32:29 crc kubenswrapper[4781]: E0227 00:32:29.520254 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49\": container with ID starting with 48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49 not found: ID does not exist" containerID="48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.520306 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49"} err="failed to get container status \"48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49\": rpc error: code = NotFound desc = could not find container \"48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49\": container with ID starting with 48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49 not found: ID does not exist" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.520339 4781 scope.go:117] "RemoveContainer" containerID="0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9" Feb 27 00:32:29 crc kubenswrapper[4781]: E0227 00:32:29.520688 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9\": container with ID starting with 
0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9 not found: ID does not exist" containerID="0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.520719 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9"} err="failed to get container status \"0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9\": rpc error: code = NotFound desc = could not find container \"0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9\": container with ID starting with 0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9 not found: ID does not exist" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.692734 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.693066 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.693226 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q64vj\" (UniqueName: \"kubernetes.io/projected/ca27d369-00b1-47ec-88cc-87d4a7065356-kube-api-access-q64vj\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.794983 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.795071 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.795116 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q64vj\" (UniqueName: \"kubernetes.io/projected/ca27d369-00b1-47ec-88cc-87d4a7065356-kube-api-access-q64vj\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.800180 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.801452 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.813493 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q64vj\" (UniqueName: \"kubernetes.io/projected/ca27d369-00b1-47ec-88cc-87d4a7065356-kube-api-access-q64vj\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.879284 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:30 crc kubenswrapper[4781]: I0227 00:32:30.599411 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4"] Feb 27 00:32:30 crc kubenswrapper[4781]: W0227 00:32:30.607762 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca27d369_00b1_47ec_88cc_87d4a7065356.slice/crio-d04845bd198b474e4b0be70d06fbe5733ebe209b4679dc9cc937514a68b44e80 WatchSource:0}: Error finding container d04845bd198b474e4b0be70d06fbe5733ebe209b4679dc9cc937514a68b44e80: Status 404 returned error can't find the container with id d04845bd198b474e4b0be70d06fbe5733ebe209b4679dc9cc937514a68b44e80 Feb 27 00:32:31 crc kubenswrapper[4781]: I0227 00:32:31.325835 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" path="/var/lib/kubelet/pods/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2/volumes" Feb 27 00:32:31 crc kubenswrapper[4781]: 
I0227 00:32:31.440493 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" event={"ID":"ca27d369-00b1-47ec-88cc-87d4a7065356","Type":"ContainerStarted","Data":"7ffe4cc8d82e6022d765751060e689bfec6ab82af27a06b7cad02fcc0dcc8cb1"} Feb 27 00:32:31 crc kubenswrapper[4781]: I0227 00:32:31.440543 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" event={"ID":"ca27d369-00b1-47ec-88cc-87d4a7065356","Type":"ContainerStarted","Data":"d04845bd198b474e4b0be70d06fbe5733ebe209b4679dc9cc937514a68b44e80"} Feb 27 00:32:31 crc kubenswrapper[4781]: I0227 00:32:31.460272 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" podStartSLOduration=2.03062625 podStartE2EDuration="2.460251247s" podCreationTimestamp="2026-02-27 00:32:29 +0000 UTC" firstStartedPulling="2026-02-27 00:32:30.612162183 +0000 UTC m=+1619.869701747" lastFinishedPulling="2026-02-27 00:32:31.0417872 +0000 UTC m=+1620.299326744" observedRunningTime="2026-02-27 00:32:31.456382777 +0000 UTC m=+1620.713922341" watchObservedRunningTime="2026-02-27 00:32:31.460251247 +0000 UTC m=+1620.717790801" Feb 27 00:32:34 crc kubenswrapper[4781]: I0227 00:32:34.475702 4781 generic.go:334] "Generic (PLEG): container finished" podID="ca27d369-00b1-47ec-88cc-87d4a7065356" containerID="7ffe4cc8d82e6022d765751060e689bfec6ab82af27a06b7cad02fcc0dcc8cb1" exitCode=0 Feb 27 00:32:34 crc kubenswrapper[4781]: I0227 00:32:34.475814 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" event={"ID":"ca27d369-00b1-47ec-88cc-87d4a7065356","Type":"ContainerDied","Data":"7ffe4cc8d82e6022d765751060e689bfec6ab82af27a06b7cad02fcc0dcc8cb1"} Feb 27 00:32:34 crc kubenswrapper[4781]: I0227 00:32:34.494768 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.056642 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.197795 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q64vj\" (UniqueName: \"kubernetes.io/projected/ca27d369-00b1-47ec-88cc-87d4a7065356-kube-api-access-q64vj\") pod \"ca27d369-00b1-47ec-88cc-87d4a7065356\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.198670 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-ssh-key-openstack-edpm-ipam\") pod \"ca27d369-00b1-47ec-88cc-87d4a7065356\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.198764 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-inventory\") pod \"ca27d369-00b1-47ec-88cc-87d4a7065356\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.209917 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca27d369-00b1-47ec-88cc-87d4a7065356-kube-api-access-q64vj" (OuterVolumeSpecName: "kube-api-access-q64vj") pod "ca27d369-00b1-47ec-88cc-87d4a7065356" (UID: "ca27d369-00b1-47ec-88cc-87d4a7065356"). InnerVolumeSpecName "kube-api-access-q64vj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.233825 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ca27d369-00b1-47ec-88cc-87d4a7065356" (UID: "ca27d369-00b1-47ec-88cc-87d4a7065356"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.300831 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-inventory" (OuterVolumeSpecName: "inventory") pod "ca27d369-00b1-47ec-88cc-87d4a7065356" (UID: "ca27d369-00b1-47ec-88cc-87d4a7065356"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.300894 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q64vj\" (UniqueName: \"kubernetes.io/projected/ca27d369-00b1-47ec-88cc-87d4a7065356-kube-api-access-q64vj\") on node \"crc\" DevicePath \"\"" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.300912 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.402891 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.499920 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" 
event={"ID":"ca27d369-00b1-47ec-88cc-87d4a7065356","Type":"ContainerDied","Data":"d04845bd198b474e4b0be70d06fbe5733ebe209b4679dc9cc937514a68b44e80"} Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.500206 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d04845bd198b474e4b0be70d06fbe5733ebe209b4679dc9cc937514a68b44e80" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.499989 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.609814 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp"] Feb 27 00:32:36 crc kubenswrapper[4781]: E0227 00:32:36.610374 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca27d369-00b1-47ec-88cc-87d4a7065356" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.610398 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca27d369-00b1-47ec-88cc-87d4a7065356" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.610709 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca27d369-00b1-47ec-88cc-87d4a7065356" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.611688 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.613668 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.614498 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.614638 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.614751 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.619740 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp"] Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.708659 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.708942 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f9qv\" (UniqueName: \"kubernetes.io/projected/94c301c2-f624-44a1-ad01-7d60748c5fca-kube-api-access-7f9qv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.709020 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.709320 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.811113 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.811169 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.811218 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f9qv\" (UniqueName: 
\"kubernetes.io/projected/94c301c2-f624-44a1-ad01-7d60748c5fca-kube-api-access-7f9qv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.811238 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.815059 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.815554 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.815587 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.837171 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f9qv\" (UniqueName: \"kubernetes.io/projected/94c301c2-f624-44a1-ad01-7d60748c5fca-kube-api-access-7f9qv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.972797 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:37 crc kubenswrapper[4781]: I0227 00:32:37.511221 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp"] Feb 27 00:32:37 crc kubenswrapper[4781]: W0227 00:32:37.512552 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94c301c2_f624_44a1_ad01_7d60748c5fca.slice/crio-0f554d217df45e166b4e450e9197072feb57a19920193828d93e3c4d8a9aab0f WatchSource:0}: Error finding container 0f554d217df45e166b4e450e9197072feb57a19920193828d93e3c4d8a9aab0f: Status 404 returned error can't find the container with id 0f554d217df45e166b4e450e9197072feb57a19920193828d93e3c4d8a9aab0f Feb 27 00:32:38 crc kubenswrapper[4781]: I0227 00:32:38.523292 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" event={"ID":"94c301c2-f624-44a1-ad01-7d60748c5fca","Type":"ContainerStarted","Data":"0f554d217df45e166b4e450e9197072feb57a19920193828d93e3c4d8a9aab0f"} Feb 27 00:32:39 crc kubenswrapper[4781]: I0227 00:32:39.535276 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" 
event={"ID":"94c301c2-f624-44a1-ad01-7d60748c5fca","Type":"ContainerStarted","Data":"c9f410e8ea0a201af9b55d546472dc35416b2a24d8046632364c67fede87b408"} Feb 27 00:32:39 crc kubenswrapper[4781]: I0227 00:32:39.553980 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" podStartSLOduration=2.432097821 podStartE2EDuration="3.553960061s" podCreationTimestamp="2026-02-27 00:32:36 +0000 UTC" firstStartedPulling="2026-02-27 00:32:37.516010923 +0000 UTC m=+1626.773550487" lastFinishedPulling="2026-02-27 00:32:38.637873163 +0000 UTC m=+1627.895412727" observedRunningTime="2026-02-27 00:32:39.548308234 +0000 UTC m=+1628.805847818" watchObservedRunningTime="2026-02-27 00:32:39.553960061 +0000 UTC m=+1628.811499615" Feb 27 00:33:27 crc kubenswrapper[4781]: I0227 00:33:27.642172 4781 scope.go:117] "RemoveContainer" containerID="914d10b311f6e761cfe3376de0d9169e16d04822bd5c0495a9b64cbbe456b1f4" Feb 27 00:33:27 crc kubenswrapper[4781]: I0227 00:33:27.693409 4781 scope.go:117] "RemoveContainer" containerID="da1dbeb22d52f0e9e8028b046b421ef782d44fa0719cff0b4421d346eb2fd5aa" Feb 27 00:33:42 crc kubenswrapper[4781]: I0227 00:33:42.895512 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:33:42 crc kubenswrapper[4781]: I0227 00:33:42.896870 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.166481 4781 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535874-9b4fw"] Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.169667 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.175838 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.176067 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.176253 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.180372 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535874-9b4fw"] Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.277166 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5szk9\" (UniqueName: \"kubernetes.io/projected/21bdad75-a7e5-4940-9ee3-be513a55b97d-kube-api-access-5szk9\") pod \"auto-csr-approver-29535874-9b4fw\" (UID: \"21bdad75-a7e5-4940-9ee3-be513a55b97d\") " pod="openshift-infra/auto-csr-approver-29535874-9b4fw" Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.380507 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5szk9\" (UniqueName: \"kubernetes.io/projected/21bdad75-a7e5-4940-9ee3-be513a55b97d-kube-api-access-5szk9\") pod \"auto-csr-approver-29535874-9b4fw\" (UID: \"21bdad75-a7e5-4940-9ee3-be513a55b97d\") " pod="openshift-infra/auto-csr-approver-29535874-9b4fw" Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.412986 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5szk9\" 
(UniqueName: \"kubernetes.io/projected/21bdad75-a7e5-4940-9ee3-be513a55b97d-kube-api-access-5szk9\") pod \"auto-csr-approver-29535874-9b4fw\" (UID: \"21bdad75-a7e5-4940-9ee3-be513a55b97d\") " pod="openshift-infra/auto-csr-approver-29535874-9b4fw" Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.507037 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" Feb 27 00:34:01 crc kubenswrapper[4781]: I0227 00:34:01.036909 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535874-9b4fw"] Feb 27 00:34:01 crc kubenswrapper[4781]: W0227 00:34:01.043729 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21bdad75_a7e5_4940_9ee3_be513a55b97d.slice/crio-c1c1fc30ddbbac44afe2638670904d909d0b2d5a07a5af4aea258b07d7edab17 WatchSource:0}: Error finding container c1c1fc30ddbbac44afe2638670904d909d0b2d5a07a5af4aea258b07d7edab17: Status 404 returned error can't find the container with id c1c1fc30ddbbac44afe2638670904d909d0b2d5a07a5af4aea258b07d7edab17 Feb 27 00:34:01 crc kubenswrapper[4781]: I0227 00:34:01.587943 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" event={"ID":"21bdad75-a7e5-4940-9ee3-be513a55b97d","Type":"ContainerStarted","Data":"c1c1fc30ddbbac44afe2638670904d909d0b2d5a07a5af4aea258b07d7edab17"} Feb 27 00:34:02 crc kubenswrapper[4781]: I0227 00:34:02.598895 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" event={"ID":"21bdad75-a7e5-4940-9ee3-be513a55b97d","Type":"ContainerStarted","Data":"172b3310c26572010bb7e76f998ac931b571b090edac45e7e85d3b3c5cd6c47d"} Feb 27 00:34:02 crc kubenswrapper[4781]: I0227 00:34:02.624525 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" 
podStartSLOduration=1.634584515 podStartE2EDuration="2.624498816s" podCreationTimestamp="2026-02-27 00:34:00 +0000 UTC" firstStartedPulling="2026-02-27 00:34:01.058497073 +0000 UTC m=+1710.316036627" lastFinishedPulling="2026-02-27 00:34:02.048411354 +0000 UTC m=+1711.305950928" observedRunningTime="2026-02-27 00:34:02.617007552 +0000 UTC m=+1711.874547106" watchObservedRunningTime="2026-02-27 00:34:02.624498816 +0000 UTC m=+1711.882038400" Feb 27 00:34:03 crc kubenswrapper[4781]: I0227 00:34:03.624424 4781 generic.go:334] "Generic (PLEG): container finished" podID="21bdad75-a7e5-4940-9ee3-be513a55b97d" containerID="172b3310c26572010bb7e76f998ac931b571b090edac45e7e85d3b3c5cd6c47d" exitCode=0 Feb 27 00:34:03 crc kubenswrapper[4781]: I0227 00:34:03.624814 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" event={"ID":"21bdad75-a7e5-4940-9ee3-be513a55b97d","Type":"ContainerDied","Data":"172b3310c26572010bb7e76f998ac931b571b090edac45e7e85d3b3c5cd6c47d"} Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.104004 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.296462 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5szk9\" (UniqueName: \"kubernetes.io/projected/21bdad75-a7e5-4940-9ee3-be513a55b97d-kube-api-access-5szk9\") pod \"21bdad75-a7e5-4940-9ee3-be513a55b97d\" (UID: \"21bdad75-a7e5-4940-9ee3-be513a55b97d\") " Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.302264 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21bdad75-a7e5-4940-9ee3-be513a55b97d-kube-api-access-5szk9" (OuterVolumeSpecName: "kube-api-access-5szk9") pod "21bdad75-a7e5-4940-9ee3-be513a55b97d" (UID: "21bdad75-a7e5-4940-9ee3-be513a55b97d"). InnerVolumeSpecName "kube-api-access-5szk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.398567 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5szk9\" (UniqueName: \"kubernetes.io/projected/21bdad75-a7e5-4940-9ee3-be513a55b97d-kube-api-access-5szk9\") on node \"crc\" DevicePath \"\"" Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.654114 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" event={"ID":"21bdad75-a7e5-4940-9ee3-be513a55b97d","Type":"ContainerDied","Data":"c1c1fc30ddbbac44afe2638670904d909d0b2d5a07a5af4aea258b07d7edab17"} Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.654181 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1c1fc30ddbbac44afe2638670904d909d0b2d5a07a5af4aea258b07d7edab17" Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.654267 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.719839 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535868-f5csp"] Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.734023 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535868-f5csp"] Feb 27 00:34:07 crc kubenswrapper[4781]: I0227 00:34:07.327480 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3df72f1-7ac9-4877-a7b4-a17b5c724303" path="/var/lib/kubelet/pods/f3df72f1-7ac9-4877-a7b4-a17b5c724303/volumes" Feb 27 00:34:12 crc kubenswrapper[4781]: I0227 00:34:12.895979 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 27 00:34:12 crc kubenswrapper[4781]: I0227 00:34:12.896546 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:34:27 crc kubenswrapper[4781]: I0227 00:34:27.809392 4781 scope.go:117] "RemoveContainer" containerID="5c6246746a3c78078a59adb64a2979be72d82f5cfd95c152a4db993cadaf1efe" Feb 27 00:34:27 crc kubenswrapper[4781]: I0227 00:34:27.842591 4781 scope.go:117] "RemoveContainer" containerID="8eb943556508c5cc9103fa044300406224b9b4973d8e501d8f7538f1c3573e24" Feb 27 00:34:27 crc kubenswrapper[4781]: I0227 00:34:27.914910 4781 scope.go:117] "RemoveContainer" containerID="6964fd56259850480217527d40244a043795966342292bb5a943a33534e5489f" Feb 27 00:34:27 crc kubenswrapper[4781]: I0227 00:34:27.957379 4781 scope.go:117] "RemoveContainer" containerID="58983f3a0d32568b0a106e31b532196dd7e3e78ec29a99f5dc4c44649ec4e605" Feb 27 00:34:28 crc kubenswrapper[4781]: I0227 00:34:28.008415 4781 scope.go:117] "RemoveContainer" containerID="7aaaa3159dfec72ce2bfd72718ace0516b0de685b4c75d813a19d16d4226019b" Feb 27 00:34:42 crc kubenswrapper[4781]: I0227 00:34:42.895987 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:34:42 crc kubenswrapper[4781]: I0227 00:34:42.897085 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:34:42 crc kubenswrapper[4781]: I0227 00:34:42.897172 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:34:42 crc kubenswrapper[4781]: I0227 00:34:42.898677 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:34:42 crc kubenswrapper[4781]: I0227 00:34:42.898754 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" gracePeriod=600 Feb 27 00:34:43 crc kubenswrapper[4781]: E0227 00:34:43.029954 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:34:43 crc kubenswrapper[4781]: I0227 00:34:43.101681 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" exitCode=0 Feb 27 00:34:43 crc kubenswrapper[4781]: I0227 00:34:43.101737 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f"} Feb 27 00:34:43 crc kubenswrapper[4781]: I0227 00:34:43.101775 4781 scope.go:117] "RemoveContainer" containerID="18f81d6f38ae3802e83160171263bed0ca095345d87ab2807429711c0c761818" Feb 27 00:34:43 crc kubenswrapper[4781]: I0227 00:34:43.102755 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:34:43 crc kubenswrapper[4781]: E0227 00:34:43.103079 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:34:56 crc kubenswrapper[4781]: I0227 00:34:56.309655 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:34:56 crc kubenswrapper[4781]: E0227 00:34:56.310400 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:35:05 crc kubenswrapper[4781]: E0227 00:35:05.935752 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory 
cache]" Feb 27 00:35:07 crc kubenswrapper[4781]: I0227 00:35:07.309855 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:35:07 crc kubenswrapper[4781]: E0227 00:35:07.310366 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:35:18 crc kubenswrapper[4781]: I0227 00:35:18.309915 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:35:18 crc kubenswrapper[4781]: E0227 00:35:18.310864 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:35:28 crc kubenswrapper[4781]: I0227 00:35:28.161766 4781 scope.go:117] "RemoveContainer" containerID="7a5345b65b014bc9d0e2cd844013d91d1a91d4e408c41f0c7f4f964de80130f6" Feb 27 00:35:28 crc kubenswrapper[4781]: I0227 00:35:28.195808 4781 scope.go:117] "RemoveContainer" containerID="cfcdb38663d80d12b7e86a05dfe2ce7cc23ff17e6af4e336ba2f0e4a180806c3" Feb 27 00:35:28 crc kubenswrapper[4781]: I0227 00:35:28.223319 4781 scope.go:117] "RemoveContainer" containerID="16ba8a242e20589655027929d1c82fa25c3d9fc988018051237357efea8a8ec9" Feb 27 00:35:28 crc kubenswrapper[4781]: I0227 00:35:28.247394 4781 scope.go:117] "RemoveContainer" 
containerID="9155e1f68a6370d2a59d952aff96914080df4756a62f18bb9bbc3ec507e49ef4" Feb 27 00:35:28 crc kubenswrapper[4781]: I0227 00:35:28.266200 4781 scope.go:117] "RemoveContainer" containerID="7cb922ac2fcfd76994a7254d975044d1fe0a7563db3547acc86bfb78f94c47a2" Feb 27 00:35:28 crc kubenswrapper[4781]: I0227 00:35:28.289918 4781 scope.go:117] "RemoveContainer" containerID="bfa97c01ece2e8cbadd8eda7e12994d67d495e411ba60ed25dc9b412019a8f03" Feb 27 00:35:28 crc kubenswrapper[4781]: I0227 00:35:28.319057 4781 scope.go:117] "RemoveContainer" containerID="be2fe215086cd4058aea52c301ed09e04ac3143d7e54d38772b785701e47e5f8" Feb 27 00:35:29 crc kubenswrapper[4781]: I0227 00:35:29.309685 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:35:29 crc kubenswrapper[4781]: E0227 00:35:29.310110 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:35:32 crc kubenswrapper[4781]: I0227 00:35:32.661084 4781 generic.go:334] "Generic (PLEG): container finished" podID="94c301c2-f624-44a1-ad01-7d60748c5fca" containerID="c9f410e8ea0a201af9b55d546472dc35416b2a24d8046632364c67fede87b408" exitCode=0 Feb 27 00:35:32 crc kubenswrapper[4781]: I0227 00:35:32.661161 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" event={"ID":"94c301c2-f624-44a1-ad01-7d60748c5fca","Type":"ContainerDied","Data":"c9f410e8ea0a201af9b55d546472dc35416b2a24d8046632364c67fede87b408"} Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.188096 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.338041 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-inventory\") pod \"94c301c2-f624-44a1-ad01-7d60748c5fca\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.338303 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f9qv\" (UniqueName: \"kubernetes.io/projected/94c301c2-f624-44a1-ad01-7d60748c5fca-kube-api-access-7f9qv\") pod \"94c301c2-f624-44a1-ad01-7d60748c5fca\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.338419 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-ssh-key-openstack-edpm-ipam\") pod \"94c301c2-f624-44a1-ad01-7d60748c5fca\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.338469 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-bootstrap-combined-ca-bundle\") pod \"94c301c2-f624-44a1-ad01-7d60748c5fca\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.345144 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c301c2-f624-44a1-ad01-7d60748c5fca-kube-api-access-7f9qv" (OuterVolumeSpecName: "kube-api-access-7f9qv") pod "94c301c2-f624-44a1-ad01-7d60748c5fca" (UID: "94c301c2-f624-44a1-ad01-7d60748c5fca"). InnerVolumeSpecName "kube-api-access-7f9qv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.349173 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "94c301c2-f624-44a1-ad01-7d60748c5fca" (UID: "94c301c2-f624-44a1-ad01-7d60748c5fca"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.375136 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "94c301c2-f624-44a1-ad01-7d60748c5fca" (UID: "94c301c2-f624-44a1-ad01-7d60748c5fca"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.378779 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-inventory" (OuterVolumeSpecName: "inventory") pod "94c301c2-f624-44a1-ad01-7d60748c5fca" (UID: "94c301c2-f624-44a1-ad01-7d60748c5fca"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.440890 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f9qv\" (UniqueName: \"kubernetes.io/projected/94c301c2-f624-44a1-ad01-7d60748c5fca-kube-api-access-7f9qv\") on node \"crc\" DevicePath \"\"" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.440930 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.440943 4781 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.440954 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.682380 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" event={"ID":"94c301c2-f624-44a1-ad01-7d60748c5fca","Type":"ContainerDied","Data":"0f554d217df45e166b4e450e9197072feb57a19920193828d93e3c4d8a9aab0f"} Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.682423 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f554d217df45e166b4e450e9197072feb57a19920193828d93e3c4d8a9aab0f" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.682501 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.807086 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs"] Feb 27 00:35:34 crc kubenswrapper[4781]: E0227 00:35:34.807599 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c301c2-f624-44a1-ad01-7d60748c5fca" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.807619 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c301c2-f624-44a1-ad01-7d60748c5fca" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 27 00:35:34 crc kubenswrapper[4781]: E0227 00:35:34.807660 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21bdad75-a7e5-4940-9ee3-be513a55b97d" containerName="oc" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.807668 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="21bdad75-a7e5-4940-9ee3-be513a55b97d" containerName="oc" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.807904 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="94c301c2-f624-44a1-ad01-7d60748c5fca" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.807921 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="21bdad75-a7e5-4940-9ee3-be513a55b97d" containerName="oc" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.808699 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.826618 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs"] Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.827158 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.827261 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.827552 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.828177 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.983804 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.984124 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qgqr\" (UniqueName: \"kubernetes.io/projected/756e2fbc-556d-44b8-8820-e469ae73ff3b-kube-api-access-2qgqr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:34 crc 
kubenswrapper[4781]: I0227 00:35:34.984259 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:35 crc kubenswrapper[4781]: I0227 00:35:35.085997 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:35 crc kubenswrapper[4781]: I0227 00:35:35.086568 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:35 crc kubenswrapper[4781]: I0227 00:35:35.086845 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qgqr\" (UniqueName: \"kubernetes.io/projected/756e2fbc-556d-44b8-8820-e469ae73ff3b-kube-api-access-2qgqr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:35 crc kubenswrapper[4781]: I0227 00:35:35.089753 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs"
Feb 27 00:35:35 crc kubenswrapper[4781]: I0227 00:35:35.091918 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs"
Feb 27 00:35:35 crc kubenswrapper[4781]: I0227 00:35:35.103028 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qgqr\" (UniqueName: \"kubernetes.io/projected/756e2fbc-556d-44b8-8820-e469ae73ff3b-kube-api-access-2qgqr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs"
Feb 27 00:35:35 crc kubenswrapper[4781]: I0227 00:35:35.137893 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs"
Feb 27 00:35:35 crc kubenswrapper[4781]: I0227 00:35:35.767033 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs"]
Feb 27 00:35:36 crc kubenswrapper[4781]: I0227 00:35:36.706748 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" event={"ID":"756e2fbc-556d-44b8-8820-e469ae73ff3b","Type":"ContainerStarted","Data":"2381181a031f2c79d017f2baa667d5dd32b801106d0766673275724fedbb49d5"}
Feb 27 00:35:36 crc kubenswrapper[4781]: I0227 00:35:36.707466 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" event={"ID":"756e2fbc-556d-44b8-8820-e469ae73ff3b","Type":"ContainerStarted","Data":"3f5e71502ea441afeaae2337c126ac0cba6512e3c3bd88e9bd95ee0b8fb3c58b"}
Feb 27 00:35:36 crc kubenswrapper[4781]: I0227 00:35:36.746908 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" podStartSLOduration=2.335755388 podStartE2EDuration="2.746885645s" podCreationTimestamp="2026-02-27 00:35:34 +0000 UTC" firstStartedPulling="2026-02-27 00:35:35.772004435 +0000 UTC m=+1805.029543989" lastFinishedPulling="2026-02-27 00:35:36.183134692 +0000 UTC m=+1805.440674246" observedRunningTime="2026-02-27 00:35:36.720915502 +0000 UTC m=+1805.978455066" watchObservedRunningTime="2026-02-27 00:35:36.746885645 +0000 UTC m=+1806.004425199"
Feb 27 00:35:41 crc kubenswrapper[4781]: I0227 00:35:41.319682 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f"
Feb 27 00:35:41 crc kubenswrapper[4781]: E0227 00:35:41.320550 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:35:55 crc kubenswrapper[4781]: I0227 00:35:55.310084 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f"
Feb 27 00:35:55 crc kubenswrapper[4781]: E0227 00:35:55.310954 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.163254 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535876-2l88l"]
Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.165450 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535876-2l88l"
Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.167250 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr"
Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.167752 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.173357 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.188765 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535876-2l88l"]
Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.273817 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwns7\" (UniqueName: \"kubernetes.io/projected/f9301966-9820-4623-8393-f185a0616743-kube-api-access-rwns7\") pod \"auto-csr-approver-29535876-2l88l\" (UID: \"f9301966-9820-4623-8393-f185a0616743\") " pod="openshift-infra/auto-csr-approver-29535876-2l88l"
Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.375936 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwns7\" (UniqueName: \"kubernetes.io/projected/f9301966-9820-4623-8393-f185a0616743-kube-api-access-rwns7\") pod \"auto-csr-approver-29535876-2l88l\" (UID: \"f9301966-9820-4623-8393-f185a0616743\") " pod="openshift-infra/auto-csr-approver-29535876-2l88l"
Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.394405 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwns7\" (UniqueName: \"kubernetes.io/projected/f9301966-9820-4623-8393-f185a0616743-kube-api-access-rwns7\") pod \"auto-csr-approver-29535876-2l88l\" (UID: \"f9301966-9820-4623-8393-f185a0616743\") " pod="openshift-infra/auto-csr-approver-29535876-2l88l"
Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.484970 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535876-2l88l"
Feb 27 00:36:00 crc kubenswrapper[4781]: W0227 00:36:00.980946 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9301966_9820_4623_8393_f185a0616743.slice/crio-2ad68abae152e8f397d403a5bdc5e8e0dcddc147c700af7f42bfe61626d6ee95 WatchSource:0}: Error finding container 2ad68abae152e8f397d403a5bdc5e8e0dcddc147c700af7f42bfe61626d6ee95: Status 404 returned error can't find the container with id 2ad68abae152e8f397d403a5bdc5e8e0dcddc147c700af7f42bfe61626d6ee95
Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.991815 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535876-2l88l"]
Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.998280 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535876-2l88l" event={"ID":"f9301966-9820-4623-8393-f185a0616743","Type":"ContainerStarted","Data":"2ad68abae152e8f397d403a5bdc5e8e0dcddc147c700af7f42bfe61626d6ee95"}
Feb 27 00:36:03 crc kubenswrapper[4781]: I0227 00:36:03.020726 4781 generic.go:334] "Generic (PLEG): container finished" podID="f9301966-9820-4623-8393-f185a0616743" containerID="53c40723095bbd1b6e5cbec68ec5b0fac1a46ad7d3ad91a7ae622222a7ca48d5" exitCode=0
Feb 27 00:36:03 crc kubenswrapper[4781]: I0227 00:36:03.020830 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535876-2l88l" event={"ID":"f9301966-9820-4623-8393-f185a0616743","Type":"ContainerDied","Data":"53c40723095bbd1b6e5cbec68ec5b0fac1a46ad7d3ad91a7ae622222a7ca48d5"}
Feb 27 00:36:04 crc kubenswrapper[4781]: I0227 00:36:04.532287 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535876-2l88l"
Feb 27 00:36:04 crc kubenswrapper[4781]: I0227 00:36:04.681819 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwns7\" (UniqueName: \"kubernetes.io/projected/f9301966-9820-4623-8393-f185a0616743-kube-api-access-rwns7\") pod \"f9301966-9820-4623-8393-f185a0616743\" (UID: \"f9301966-9820-4623-8393-f185a0616743\") "
Feb 27 00:36:04 crc kubenswrapper[4781]: I0227 00:36:04.688066 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9301966-9820-4623-8393-f185a0616743-kube-api-access-rwns7" (OuterVolumeSpecName: "kube-api-access-rwns7") pod "f9301966-9820-4623-8393-f185a0616743" (UID: "f9301966-9820-4623-8393-f185a0616743"). InnerVolumeSpecName "kube-api-access-rwns7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:36:04 crc kubenswrapper[4781]: I0227 00:36:04.784965 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwns7\" (UniqueName: \"kubernetes.io/projected/f9301966-9820-4623-8393-f185a0616743-kube-api-access-rwns7\") on node \"crc\" DevicePath \"\""
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.046323 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535876-2l88l" event={"ID":"f9301966-9820-4623-8393-f185a0616743","Type":"ContainerDied","Data":"2ad68abae152e8f397d403a5bdc5e8e0dcddc147c700af7f42bfe61626d6ee95"}
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.046700 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ad68abae152e8f397d403a5bdc5e8e0dcddc147c700af7f42bfe61626d6ee95"
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.046376 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535876-2l88l"
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.102166 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-94sd2"]
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.116172 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-547a-account-create-update-tf2pb"]
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.128180 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-jrxqx"]
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.136823 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5255-account-create-update-k87hd"]
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.146745 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-94sd2"]
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.155761 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-f66vm"]
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.164802 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5255-account-create-update-k87hd"]
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.174151 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-jrxqx"]
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.183579 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-547a-account-create-update-tf2pb"]
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.196791 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-f66vm"]
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.333361 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cec0cd3-abcd-484c-85b8-03a44888a9b7" path="/var/lib/kubelet/pods/0cec0cd3-abcd-484c-85b8-03a44888a9b7/volumes"
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.334034 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c6016e5-2641-4b82-b164-121ae822f863" path="/var/lib/kubelet/pods/2c6016e5-2641-4b82-b164-121ae822f863/volumes"
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.335185 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bdd8664-6d91-4616-8095-f44067fdca51" path="/var/lib/kubelet/pods/6bdd8664-6d91-4616-8095-f44067fdca51/volumes"
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.336399 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c90ad80e-9897-4e20-b9b0-6add43c84bd0" path="/var/lib/kubelet/pods/c90ad80e-9897-4e20-b9b0-6add43c84bd0/volumes"
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.337400 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1713962-9458-45b2-9f28-61409b7ff581" path="/var/lib/kubelet/pods/f1713962-9458-45b2-9f28-61409b7ff581/volumes"
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.605347 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535870-pjv2c"]
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.623273 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535870-pjv2c"]
Feb 27 00:36:06 crc kubenswrapper[4781]: I0227 00:36:06.043895 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8c9b-account-create-update-d29bm"]
Feb 27 00:36:06 crc kubenswrapper[4781]: I0227 00:36:06.055477 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8c9b-account-create-update-d29bm"]
Feb 27 00:36:07 crc kubenswrapper[4781]: I0227 00:36:07.309410 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f"
Feb 27 00:36:07 crc kubenswrapper[4781]: E0227 00:36:07.310871 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:36:07 crc kubenswrapper[4781]: I0227 00:36:07.322780 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ec74af-d604-42ac-83bb-db047e8d8506" path="/var/lib/kubelet/pods/b8ec74af-d604-42ac-83bb-db047e8d8506/volumes"
Feb 27 00:36:07 crc kubenswrapper[4781]: I0227 00:36:07.324392 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb4687ec-812e-48bb-8d53-ed628f3cd013" path="/var/lib/kubelet/pods/bb4687ec-812e-48bb-8d53-ed628f3cd013/volumes"
Feb 27 00:36:20 crc kubenswrapper[4781]: I0227 00:36:20.309408 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f"
Feb 27 00:36:20 crc kubenswrapper[4781]: E0227 00:36:20.310168 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:36:28 crc kubenswrapper[4781]: I0227 00:36:28.433952 4781 scope.go:117] "RemoveContainer" containerID="a52130091c9100982624c568d31dc83849096589647f47661d0debdea301a332"
Feb 27 00:36:28 crc kubenswrapper[4781]: I0227 00:36:28.467145 4781 scope.go:117] "RemoveContainer" containerID="b5253e8bb3200baca59ed8e598dc74eaddbc9fc4ea687d121523ff8347b4d62e"
Feb 27 00:36:28 crc kubenswrapper[4781]: I0227 00:36:28.533333 4781 scope.go:117] "RemoveContainer" containerID="2520db6bdce6e0291f097369119b25f716226e74f321fc28345a81a9140017c8"
Feb 27 00:36:28 crc kubenswrapper[4781]: I0227 00:36:28.612517 4781 scope.go:117] "RemoveContainer" containerID="74853e0dfa3329c0157368e93fb3d1251b7149a8041ea7981936c9bd91076b44"
Feb 27 00:36:28 crc kubenswrapper[4781]: I0227 00:36:28.665617 4781 scope.go:117] "RemoveContainer" containerID="4d55d2c6e343b6a1d3b8b47dac42837612db67ccce352ab276d326d2b146954e"
Feb 27 00:36:28 crc kubenswrapper[4781]: I0227 00:36:28.731201 4781 scope.go:117] "RemoveContainer" containerID="8fb72d9409a124bb8fa0479e75bf3cf0cd120b3aae8696f10bef9465f2261fc6"
Feb 27 00:36:28 crc kubenswrapper[4781]: I0227 00:36:28.810381 4781 scope.go:117] "RemoveContainer" containerID="b01d66bc253f93ef989863fc6fd69c5afb4405a98783d9e32be4f4b80ce3df36"
Feb 27 00:36:31 crc kubenswrapper[4781]: I0227 00:36:31.319048 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f"
Feb 27 00:36:31 crc kubenswrapper[4781]: E0227 00:36:31.319721 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.050203 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-a05d-account-create-update-cw8zv"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.061310 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4bde-account-create-update-fpg2t"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.073028 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-wxsbg"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.085377 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-m5rm5"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.097588 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-99xdp"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.110948 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-6e38-account-create-update-dntk2"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.120443 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a8a8-account-create-update-vcwwx"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.129521 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4bde-account-create-update-fpg2t"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.138101 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-a05d-account-create-update-cw8zv"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.147028 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-m5rm5"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.157583 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a8a8-account-create-update-vcwwx"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.166268 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-99xdp"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.176058 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-6e38-account-create-update-dntk2"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.184659 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-wxsbg"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.194012 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-zvn4t"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.203086 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-zvn4t"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.211974 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-v2g9n"]
Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.222025 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-v2g9n"]
Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.344269 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eb55288-e9bb-46f0-bae3-789e8db036cf" path="/var/lib/kubelet/pods/0eb55288-e9bb-46f0-bae3-789e8db036cf/volumes"
Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.349813 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24adb929-f812-4243-94ea-23345856d28f" path="/var/lib/kubelet/pods/24adb929-f812-4243-94ea-23345856d28f/volumes"
Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.351290 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="388be198-b438-4142-8fb8-ec9831e9a1af" path="/var/lib/kubelet/pods/388be198-b438-4142-8fb8-ec9831e9a1af/volumes"
Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.351884 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae26ad0-3770-4153-a1d6-96ae3a9e36a9" path="/var/lib/kubelet/pods/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9/volumes"
Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.353336 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b9cc074-4ea1-4c04-9398-5be68fbcd5cf" path="/var/lib/kubelet/pods/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf/volumes"
Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.354825 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6344c1fe-eecb-4d57-a5c7-a857e4466439" path="/var/lib/kubelet/pods/6344c1fe-eecb-4d57-a5c7-a857e4466439/volumes"
Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.356760 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cafd294d-e929-4cd5-8be3-7175ad4aed09" path="/var/lib/kubelet/pods/cafd294d-e929-4cd5-8be3-7175ad4aed09/volumes"
Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.357471 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3aedfe4-2bbb-46c9-97d4-8d6782c44707" path="/var/lib/kubelet/pods/e3aedfe4-2bbb-46c9-97d4-8d6782c44707/volumes"
Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.359457 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8806487-486f-464d-8249-b6368daabff5" path="/var/lib/kubelet/pods/e8806487-486f-464d-8249-b6368daabff5/volumes"
Feb 27 00:36:36 crc kubenswrapper[4781]: I0227 00:36:36.032322 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-8tmft"]
Feb 27 00:36:36 crc kubenswrapper[4781]: I0227 00:36:36.040941 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-8tmft"]
Feb 27 00:36:37 crc kubenswrapper[4781]: I0227 00:36:37.030768 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-rs9bx"]
Feb 27 00:36:37 crc kubenswrapper[4781]: I0227 00:36:37.043595 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-rs9bx"]
Feb 27 00:36:37 crc kubenswrapper[4781]: I0227 00:36:37.321033 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47cc3f01-6a5c-4797-bf86-25770e66e928" path="/var/lib/kubelet/pods/47cc3f01-6a5c-4797-bf86-25770e66e928/volumes"
Feb 27 00:36:37 crc kubenswrapper[4781]: I0227 00:36:37.322565 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58b577a3-c234-4968-a8e7-c5e629de47b1" path="/var/lib/kubelet/pods/58b577a3-c234-4968-a8e7-c5e629de47b1/volumes"
Feb 27 00:36:45 crc kubenswrapper[4781]: I0227 00:36:45.310111 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f"
Feb 27 00:36:45 crc kubenswrapper[4781]: E0227 00:36:45.310973 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:36:56 crc kubenswrapper[4781]: I0227 00:36:56.309149 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f"
Feb 27 00:36:56 crc kubenswrapper[4781]: E0227 00:36:56.309948 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:37:07 crc kubenswrapper[4781]: I0227 00:37:07.727467 4781 generic.go:334] "Generic (PLEG): container finished" podID="756e2fbc-556d-44b8-8820-e469ae73ff3b" containerID="2381181a031f2c79d017f2baa667d5dd32b801106d0766673275724fedbb49d5" exitCode=0
Feb 27 00:37:07 crc kubenswrapper[4781]: I0227 00:37:07.727541 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" event={"ID":"756e2fbc-556d-44b8-8820-e469ae73ff3b","Type":"ContainerDied","Data":"2381181a031f2c79d017f2baa667d5dd32b801106d0766673275724fedbb49d5"}
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.446605 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.580563 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-ssh-key-openstack-edpm-ipam\") pod \"756e2fbc-556d-44b8-8820-e469ae73ff3b\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") "
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.580977 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qgqr\" (UniqueName: \"kubernetes.io/projected/756e2fbc-556d-44b8-8820-e469ae73ff3b-kube-api-access-2qgqr\") pod \"756e2fbc-556d-44b8-8820-e469ae73ff3b\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") "
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.581175 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-inventory\") pod \"756e2fbc-556d-44b8-8820-e469ae73ff3b\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") "
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.586427 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/756e2fbc-556d-44b8-8820-e469ae73ff3b-kube-api-access-2qgqr" (OuterVolumeSpecName: "kube-api-access-2qgqr") pod "756e2fbc-556d-44b8-8820-e469ae73ff3b" (UID: "756e2fbc-556d-44b8-8820-e469ae73ff3b"). InnerVolumeSpecName "kube-api-access-2qgqr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.618187 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-inventory" (OuterVolumeSpecName: "inventory") pod "756e2fbc-556d-44b8-8820-e469ae73ff3b" (UID: "756e2fbc-556d-44b8-8820-e469ae73ff3b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.620513 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "756e2fbc-556d-44b8-8820-e469ae73ff3b" (UID: "756e2fbc-556d-44b8-8820-e469ae73ff3b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.683689 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qgqr\" (UniqueName: \"kubernetes.io/projected/756e2fbc-556d-44b8-8820-e469ae73ff3b-kube-api-access-2qgqr\") on node \"crc\" DevicePath \"\""
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.683736 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-inventory\") on node \"crc\" DevicePath \"\""
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.683751 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.747002 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" event={"ID":"756e2fbc-556d-44b8-8820-e469ae73ff3b","Type":"ContainerDied","Data":"3f5e71502ea441afeaae2337c126ac0cba6512e3c3bd88e9bd95ee0b8fb3c58b"}
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.747044 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f5e71502ea441afeaae2337c126ac0cba6512e3c3bd88e9bd95ee0b8fb3c58b"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.747300 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.839161 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"]
Feb 27 00:37:09 crc kubenswrapper[4781]: E0227 00:37:09.839668 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756e2fbc-556d-44b8-8820-e469ae73ff3b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.839692 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="756e2fbc-556d-44b8-8820-e469ae73ff3b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 27 00:37:09 crc kubenswrapper[4781]: E0227 00:37:09.839725 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9301966-9820-4623-8393-f185a0616743" containerName="oc"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.839733 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9301966-9820-4623-8393-f185a0616743" containerName="oc"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.839964 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="756e2fbc-556d-44b8-8820-e469ae73ff3b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.839982 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9301966-9820-4623-8393-f185a0616743" containerName="oc"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.840860 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.845355 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.845581 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.845845 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.847932 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.875724 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"]
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.887732 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.888132 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.888286 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpltx\" (UniqueName: \"kubernetes.io/projected/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-kube-api-access-wpltx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.990685 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.990755 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpltx\" (UniqueName: \"kubernetes.io/projected/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-kube-api-access-wpltx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.990907 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"
Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.994880 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"
Feb 27 00:37:10 crc kubenswrapper[4781]: I0227 00:37:10.006232 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"
Feb 27 00:37:10 crc kubenswrapper[4781]: I0227 00:37:10.007598 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpltx\" (UniqueName: \"kubernetes.io/projected/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-kube-api-access-wpltx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"
Feb 27 00:37:10 crc kubenswrapper[4781]: I0227 00:37:10.162483 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"
Feb 27 00:37:10 crc kubenswrapper[4781]: I0227 00:37:10.317536 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f"
Feb 27 00:37:10 crc kubenswrapper[4781]: E0227 00:37:10.318078 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:37:10 crc kubenswrapper[4781]: I0227 00:37:10.775612 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"]
Feb 27 00:37:10 crc kubenswrapper[4781]: I0227 00:37:10.779330 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 27 00:37:11 crc kubenswrapper[4781]: I0227 00:37:11.767446 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" event={"ID":"95533111-b2e6-41c2-b7b8-edc0a82e2ca5","Type":"ContainerStarted","Data":"d24a3b741e7a4b7b6e83691b9d42820d7bef593ed487e6d9b52037b61a1964eb"}
Feb 27 00:37:11 crc kubenswrapper[4781]: I0227 00:37:11.768020 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" event={"ID":"95533111-b2e6-41c2-b7b8-edc0a82e2ca5","Type":"ContainerStarted","Data":"6accd8b64d2459c3e4f34e1caa40c9f80e86200a6e054165b7c6c4d213fc4543"}
Feb 27 00:37:11 crc kubenswrapper[4781]: I0227 00:37:11.803829 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" podStartSLOduration=2.241094592 podStartE2EDuration="2.803808496s" podCreationTimestamp="2026-02-27 00:37:09 +0000 UTC" firstStartedPulling="2026-02-27 00:37:10.779143 +0000 UTC m=+1900.036682554" lastFinishedPulling="2026-02-27 00:37:11.341856904 +0000 UTC m=+1900.599396458" observedRunningTime="2026-02-27 00:37:11.78864456 +0000 UTC m=+1901.046184114" watchObservedRunningTime="2026-02-27 00:37:11.803808496 +0000 UTC m=+1901.061348050" Feb 27 00:37:12 crc kubenswrapper[4781]: I0227 00:37:12.042254 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-bk54r"] Feb 27 00:37:12 crc kubenswrapper[4781]: I0227 00:37:12.053292 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-bk54r"] Feb 27 00:37:13 crc kubenswrapper[4781]: I0227 00:37:13.321872 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f43ab5c-f862-468c-92c1-ec7366eb7ed0" path="/var/lib/kubelet/pods/3f43ab5c-f862-468c-92c1-ec7366eb7ed0/volumes" Feb 27 00:37:24 crc kubenswrapper[4781]: I0227 00:37:24.309113 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:37:24 crc kubenswrapper[4781]: E0227 00:37:24.310069 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.016520 4781 scope.go:117] "RemoveContainer" containerID="d19d827d09664d0dd3483609af04ecbb9a2549b9335d9da322a84e9180f2130b" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.060549 4781 
scope.go:117] "RemoveContainer" containerID="6d76d1e8767f2bf9f86c0f509bcf89309b39540bcf16a94f15017d9639753143" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.141700 4781 scope.go:117] "RemoveContainer" containerID="e75379ab5c604b926c8da8b4e1bc70d938265b4b81cac412dc92c66988d11e4a" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.187843 4781 scope.go:117] "RemoveContainer" containerID="6743d7b0c9868a62aac9ecae7e0ec57bc1eee6923be88c6054b55ea63c96129c" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.227019 4781 scope.go:117] "RemoveContainer" containerID="08f09b8baf0d256e75e4f2cea8a8050728aa867b805093cf4bae153a92736b36" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.273425 4781 scope.go:117] "RemoveContainer" containerID="9fc8ab8561670a45356ed0c0f51ff964f3556019e4a98628e764c0be8c981d4c" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.322841 4781 scope.go:117] "RemoveContainer" containerID="69da9fba4081d0816d2a2271ca344a6097bd067857fe6ffab787c65da0531cbc" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.373344 4781 scope.go:117] "RemoveContainer" containerID="297b6944b15c3822e081c593733409a3c29b72246756946b04eaf97a2a16c5d2" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.396330 4781 scope.go:117] "RemoveContainer" containerID="f6f1fd0f3e8826d700e5044d1fe1b6b827695311ff2f847e95e5ba49a2863393" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.425592 4781 scope.go:117] "RemoveContainer" containerID="9abe8ef3a48995708f20de72923495db036e6761eb107a6dfc8ea5dccc96bf58" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.462970 4781 scope.go:117] "RemoveContainer" containerID="6dace96637328dc4640d3549a1c802cf99efe23b4ad5c291813668a60dc8b49e" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.484987 4781 scope.go:117] "RemoveContainer" containerID="c1465b73a1df33b94300981b2d1ed1143dd7203d14e97be01d951e1a43d63b4b" Feb 27 00:37:31 crc kubenswrapper[4781]: I0227 00:37:31.050824 4781 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/placement-db-sync-jqsnp"] Feb 27 00:37:31 crc kubenswrapper[4781]: I0227 00:37:31.065573 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jqsnp"] Feb 27 00:37:31 crc kubenswrapper[4781]: I0227 00:37:31.325367 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3fa4251-dd48-417b-8002-6df02d3d3dac" path="/var/lib/kubelet/pods/a3fa4251-dd48-417b-8002-6df02d3d3dac/volumes" Feb 27 00:37:32 crc kubenswrapper[4781]: I0227 00:37:32.039607 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gxj6b"] Feb 27 00:37:32 crc kubenswrapper[4781]: I0227 00:37:32.048551 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-bf4zw"] Feb 27 00:37:32 crc kubenswrapper[4781]: I0227 00:37:32.057520 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-bf4zw"] Feb 27 00:37:32 crc kubenswrapper[4781]: I0227 00:37:32.066789 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-gxj6b"] Feb 27 00:37:33 crc kubenswrapper[4781]: I0227 00:37:33.327287 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="314ca901-3264-4136-b377-daad0075b72c" path="/var/lib/kubelet/pods/314ca901-3264-4136-b377-daad0075b72c/volumes" Feb 27 00:37:33 crc kubenswrapper[4781]: I0227 00:37:33.328700 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" path="/var/lib/kubelet/pods/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1/volumes" Feb 27 00:37:35 crc kubenswrapper[4781]: I0227 00:37:35.046876 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-9vlp4"] Feb 27 00:37:35 crc kubenswrapper[4781]: I0227 00:37:35.059809 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-9vlp4"] Feb 27 00:37:35 crc kubenswrapper[4781]: I0227 00:37:35.321294 4781 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aef65495-ecb2-4396-bb05-a4c5ee48f291" path="/var/lib/kubelet/pods/aef65495-ecb2-4396-bb05-a4c5ee48f291/volumes" Feb 27 00:37:39 crc kubenswrapper[4781]: I0227 00:37:39.309290 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:37:39 crc kubenswrapper[4781]: E0227 00:37:39.310224 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.623651 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7qb2s"] Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.626895 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.634826 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7qb2s"] Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.667645 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfghf\" (UniqueName: \"kubernetes.io/projected/711e5f04-7574-4aae-921b-84beb876849f-kube-api-access-gfghf\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.668041 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-catalog-content\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.668174 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-utilities\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.770133 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-catalog-content\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.770171 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-utilities\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.770267 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfghf\" (UniqueName: \"kubernetes.io/projected/711e5f04-7574-4aae-921b-84beb876849f-kube-api-access-gfghf\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.771021 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-catalog-content\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.771038 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-utilities\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.792543 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfghf\" (UniqueName: \"kubernetes.io/projected/711e5f04-7574-4aae-921b-84beb876849f-kube-api-access-gfghf\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.988091 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:45 crc kubenswrapper[4781]: I0227 00:37:45.488880 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7qb2s"] Feb 27 00:37:46 crc kubenswrapper[4781]: I0227 00:37:46.166450 4781 generic.go:334] "Generic (PLEG): container finished" podID="711e5f04-7574-4aae-921b-84beb876849f" containerID="9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884" exitCode=0 Feb 27 00:37:46 crc kubenswrapper[4781]: I0227 00:37:46.166546 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qb2s" event={"ID":"711e5f04-7574-4aae-921b-84beb876849f","Type":"ContainerDied","Data":"9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884"} Feb 27 00:37:46 crc kubenswrapper[4781]: I0227 00:37:46.166891 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qb2s" event={"ID":"711e5f04-7574-4aae-921b-84beb876849f","Type":"ContainerStarted","Data":"14c71355d65ecb6c9f56a4511e8798166b28ba0593bf8c51d3f7d5c3c0a96991"} Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.177582 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qb2s" event={"ID":"711e5f04-7574-4aae-921b-84beb876849f","Type":"ContainerStarted","Data":"c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103"} Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.217973 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q6qt2"] Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.221330 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.233990 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6qt2"] Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.331353 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7crc7\" (UniqueName: \"kubernetes.io/projected/55171d71-37a1-422f-8209-3880be373d30-kube-api-access-7crc7\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.331403 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-utilities\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.331966 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-catalog-content\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.434679 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7crc7\" (UniqueName: \"kubernetes.io/projected/55171d71-37a1-422f-8209-3880be373d30-kube-api-access-7crc7\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.434739 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-utilities\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.434869 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-catalog-content\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.435484 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-utilities\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.435562 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-catalog-content\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.474367 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7crc7\" (UniqueName: \"kubernetes.io/projected/55171d71-37a1-422f-8209-3880be373d30-kube-api-access-7crc7\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.538428 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:48 crc kubenswrapper[4781]: W0227 00:37:48.034610 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55171d71_37a1_422f_8209_3880be373d30.slice/crio-78514d35d2a0766ea954f03f71c3b3be321ce1ff36f317d754d1cbc3feb05265 WatchSource:0}: Error finding container 78514d35d2a0766ea954f03f71c3b3be321ce1ff36f317d754d1cbc3feb05265: Status 404 returned error can't find the container with id 78514d35d2a0766ea954f03f71c3b3be321ce1ff36f317d754d1cbc3feb05265 Feb 27 00:37:48 crc kubenswrapper[4781]: I0227 00:37:48.044074 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6qt2"] Feb 27 00:37:48 crc kubenswrapper[4781]: I0227 00:37:48.186605 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6qt2" event={"ID":"55171d71-37a1-422f-8209-3880be373d30","Type":"ContainerStarted","Data":"78514d35d2a0766ea954f03f71c3b3be321ce1ff36f317d754d1cbc3feb05265"} Feb 27 00:37:49 crc kubenswrapper[4781]: I0227 00:37:49.200221 4781 generic.go:334] "Generic (PLEG): container finished" podID="55171d71-37a1-422f-8209-3880be373d30" containerID="8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8" exitCode=0 Feb 27 00:37:49 crc kubenswrapper[4781]: I0227 00:37:49.200304 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6qt2" event={"ID":"55171d71-37a1-422f-8209-3880be373d30","Type":"ContainerDied","Data":"8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8"} Feb 27 00:37:50 crc kubenswrapper[4781]: I0227 00:37:50.211871 4781 generic.go:334] "Generic (PLEG): container finished" podID="711e5f04-7574-4aae-921b-84beb876849f" containerID="c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103" exitCode=0 Feb 27 00:37:50 crc kubenswrapper[4781]: I0227 
00:37:50.211936 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qb2s" event={"ID":"711e5f04-7574-4aae-921b-84beb876849f","Type":"ContainerDied","Data":"c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103"} Feb 27 00:37:50 crc kubenswrapper[4781]: I0227 00:37:50.215699 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6qt2" event={"ID":"55171d71-37a1-422f-8209-3880be373d30","Type":"ContainerStarted","Data":"3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f"} Feb 27 00:37:51 crc kubenswrapper[4781]: I0227 00:37:51.228008 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qb2s" event={"ID":"711e5f04-7574-4aae-921b-84beb876849f","Type":"ContainerStarted","Data":"dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893"} Feb 27 00:37:51 crc kubenswrapper[4781]: I0227 00:37:51.252118 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7qb2s" podStartSLOduration=2.477681404 podStartE2EDuration="7.252099271s" podCreationTimestamp="2026-02-27 00:37:44 +0000 UTC" firstStartedPulling="2026-02-27 00:37:46.168586913 +0000 UTC m=+1935.426126467" lastFinishedPulling="2026-02-27 00:37:50.94300476 +0000 UTC m=+1940.200544334" observedRunningTime="2026-02-27 00:37:51.248709103 +0000 UTC m=+1940.506248657" watchObservedRunningTime="2026-02-27 00:37:51.252099271 +0000 UTC m=+1940.509638825" Feb 27 00:37:52 crc kubenswrapper[4781]: I0227 00:37:52.239324 4781 generic.go:334] "Generic (PLEG): container finished" podID="55171d71-37a1-422f-8209-3880be373d30" containerID="3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f" exitCode=0 Feb 27 00:37:52 crc kubenswrapper[4781]: I0227 00:37:52.239446 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6qt2" 
event={"ID":"55171d71-37a1-422f-8209-3880be373d30","Type":"ContainerDied","Data":"3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f"} Feb 27 00:37:53 crc kubenswrapper[4781]: I0227 00:37:53.254272 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6qt2" event={"ID":"55171d71-37a1-422f-8209-3880be373d30","Type":"ContainerStarted","Data":"9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6"} Feb 27 00:37:53 crc kubenswrapper[4781]: I0227 00:37:53.282576 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q6qt2" podStartSLOduration=2.826381063 podStartE2EDuration="6.282558969s" podCreationTimestamp="2026-02-27 00:37:47 +0000 UTC" firstStartedPulling="2026-02-27 00:37:49.202181934 +0000 UTC m=+1938.459721488" lastFinishedPulling="2026-02-27 00:37:52.65835984 +0000 UTC m=+1941.915899394" observedRunningTime="2026-02-27 00:37:53.273287197 +0000 UTC m=+1942.530826761" watchObservedRunningTime="2026-02-27 00:37:53.282558969 +0000 UTC m=+1942.540098523" Feb 27 00:37:53 crc kubenswrapper[4781]: I0227 00:37:53.310601 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:37:53 crc kubenswrapper[4781]: E0227 00:37:53.311017 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:37:54 crc kubenswrapper[4781]: I0227 00:37:54.988329 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:54 crc 
kubenswrapper[4781]: I0227 00:37:54.988683 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:56 crc kubenswrapper[4781]: I0227 00:37:56.037189 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7qb2s" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="registry-server" probeResult="failure" output=< Feb 27 00:37:56 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 00:37:56 crc kubenswrapper[4781]: > Feb 27 00:37:57 crc kubenswrapper[4781]: I0227 00:37:57.539018 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:57 crc kubenswrapper[4781]: I0227 00:37:57.539077 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:57 crc kubenswrapper[4781]: I0227 00:37:57.589747 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:58 crc kubenswrapper[4781]: I0227 00:37:58.359341 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:58 crc kubenswrapper[4781]: I0227 00:37:58.415817 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6qt2"] Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.147643 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535878-49z87"] Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.149915 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535878-49z87" Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.152548 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.156899 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.156899 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.160611 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535878-49z87"] Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.321840 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4hqj\" (UniqueName: \"kubernetes.io/projected/b63206fe-04b3-4f07-a4cb-f8fd89645931-kube-api-access-m4hqj\") pod \"auto-csr-approver-29535878-49z87\" (UID: \"b63206fe-04b3-4f07-a4cb-f8fd89645931\") " pod="openshift-infra/auto-csr-approver-29535878-49z87" Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.324262 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q6qt2" podUID="55171d71-37a1-422f-8209-3880be373d30" containerName="registry-server" containerID="cri-o://9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6" gracePeriod=2 Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.424284 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4hqj\" (UniqueName: \"kubernetes.io/projected/b63206fe-04b3-4f07-a4cb-f8fd89645931-kube-api-access-m4hqj\") pod \"auto-csr-approver-29535878-49z87\" (UID: \"b63206fe-04b3-4f07-a4cb-f8fd89645931\") " 
pod="openshift-infra/auto-csr-approver-29535878-49z87"
Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.450129 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4hqj\" (UniqueName: \"kubernetes.io/projected/b63206fe-04b3-4f07-a4cb-f8fd89645931-kube-api-access-m4hqj\") pod \"auto-csr-approver-29535878-49z87\" (UID: \"b63206fe-04b3-4f07-a4cb-f8fd89645931\") " pod="openshift-infra/auto-csr-approver-29535878-49z87"
Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.470363 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535878-49z87"
Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.900279 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6qt2"
Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.987619 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535878-49z87"]
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.037461 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-utilities\") pod \"55171d71-37a1-422f-8209-3880be373d30\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") "
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.037511 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7crc7\" (UniqueName: \"kubernetes.io/projected/55171d71-37a1-422f-8209-3880be373d30-kube-api-access-7crc7\") pod \"55171d71-37a1-422f-8209-3880be373d30\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") "
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.037576 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-catalog-content\") pod \"55171d71-37a1-422f-8209-3880be373d30\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") "
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.038443 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-utilities" (OuterVolumeSpecName: "utilities") pod "55171d71-37a1-422f-8209-3880be373d30" (UID: "55171d71-37a1-422f-8209-3880be373d30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.044310 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55171d71-37a1-422f-8209-3880be373d30-kube-api-access-7crc7" (OuterVolumeSpecName: "kube-api-access-7crc7") pod "55171d71-37a1-422f-8209-3880be373d30" (UID: "55171d71-37a1-422f-8209-3880be373d30"). InnerVolumeSpecName "kube-api-access-7crc7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.062697 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55171d71-37a1-422f-8209-3880be373d30" (UID: "55171d71-37a1-422f-8209-3880be373d30"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.140064 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.140100 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7crc7\" (UniqueName: \"kubernetes.io/projected/55171d71-37a1-422f-8209-3880be373d30-kube-api-access-7crc7\") on node \"crc\" DevicePath \"\""
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.140112 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.336056 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535878-49z87" event={"ID":"b63206fe-04b3-4f07-a4cb-f8fd89645931","Type":"ContainerStarted","Data":"984293dfb6eabf90a79acc02e28249f4b01ff5cb7665ca85189795184da119f5"}
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.339805 4781 generic.go:334] "Generic (PLEG): container finished" podID="55171d71-37a1-422f-8209-3880be373d30" containerID="9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6" exitCode=0
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.339861 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6qt2"
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.340028 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6qt2" event={"ID":"55171d71-37a1-422f-8209-3880be373d30","Type":"ContainerDied","Data":"9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6"}
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.340596 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6qt2" event={"ID":"55171d71-37a1-422f-8209-3880be373d30","Type":"ContainerDied","Data":"78514d35d2a0766ea954f03f71c3b3be321ce1ff36f317d754d1cbc3feb05265"}
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.340732 4781 scope.go:117] "RemoveContainer" containerID="9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6"
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.375512 4781 scope.go:117] "RemoveContainer" containerID="3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f"
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.376597 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6qt2"]
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.389910 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6qt2"]
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.397093 4781 scope.go:117] "RemoveContainer" containerID="8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8"
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.422199 4781 scope.go:117] "RemoveContainer" containerID="9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6"
Feb 27 00:38:01 crc kubenswrapper[4781]: E0227 00:38:01.422690 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6\": container with ID starting with 9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6 not found: ID does not exist" containerID="9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6"
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.422735 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6"} err="failed to get container status \"9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6\": rpc error: code = NotFound desc = could not find container \"9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6\": container with ID starting with 9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6 not found: ID does not exist"
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.422764 4781 scope.go:117] "RemoveContainer" containerID="3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f"
Feb 27 00:38:01 crc kubenswrapper[4781]: E0227 00:38:01.423054 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f\": container with ID starting with 3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f not found: ID does not exist" containerID="3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f"
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.423142 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f"} err="failed to get container status \"3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f\": rpc error: code = NotFound desc = could not find container \"3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f\": container with ID starting with 3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f not found: ID does not exist"
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.423163 4781 scope.go:117] "RemoveContainer" containerID="8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8"
Feb 27 00:38:01 crc kubenswrapper[4781]: E0227 00:38:01.424072 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8\": container with ID starting with 8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8 not found: ID does not exist" containerID="8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8"
Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.424130 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8"} err="failed to get container status \"8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8\": rpc error: code = NotFound desc = could not find container \"8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8\": container with ID starting with 8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8 not found: ID does not exist"
Feb 27 00:38:02 crc kubenswrapper[4781]: I0227 00:38:02.353428 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535878-49z87" event={"ID":"b63206fe-04b3-4f07-a4cb-f8fd89645931","Type":"ContainerStarted","Data":"e098a22e98e83ab04db629aad7e6384885fe2b771dad33544e78c6562872ae4e"}
Feb 27 00:38:02 crc kubenswrapper[4781]: I0227 00:38:02.369423 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535878-49z87" podStartSLOduration=1.48741801 podStartE2EDuration="2.36939902s" podCreationTimestamp="2026-02-27 00:38:00 +0000 UTC" firstStartedPulling="2026-02-27 00:38:00.997419666 +0000 UTC m=+1950.254959210" lastFinishedPulling="2026-02-27 00:38:01.879400666 +0000 UTC m=+1951.136940220" observedRunningTime="2026-02-27 00:38:02.367236613 +0000 UTC m=+1951.624776167" watchObservedRunningTime="2026-02-27 00:38:02.36939902 +0000 UTC m=+1951.626938584"
Feb 27 00:38:03 crc kubenswrapper[4781]: I0227 00:38:03.323970 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55171d71-37a1-422f-8209-3880be373d30" path="/var/lib/kubelet/pods/55171d71-37a1-422f-8209-3880be373d30/volumes"
Feb 27 00:38:03 crc kubenswrapper[4781]: I0227 00:38:03.365881 4781 generic.go:334] "Generic (PLEG): container finished" podID="b63206fe-04b3-4f07-a4cb-f8fd89645931" containerID="e098a22e98e83ab04db629aad7e6384885fe2b771dad33544e78c6562872ae4e" exitCode=0
Feb 27 00:38:03 crc kubenswrapper[4781]: I0227 00:38:03.365940 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535878-49z87" event={"ID":"b63206fe-04b3-4f07-a4cb-f8fd89645931","Type":"ContainerDied","Data":"e098a22e98e83ab04db629aad7e6384885fe2b771dad33544e78c6562872ae4e"}
Feb 27 00:38:04 crc kubenswrapper[4781]: I0227 00:38:04.309977 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f"
Feb 27 00:38:04 crc kubenswrapper[4781]: E0227 00:38:04.310276 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:38:04 crc kubenswrapper[4781]: I0227 00:38:04.832775 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535878-49z87"
Feb 27 00:38:04 crc kubenswrapper[4781]: I0227 00:38:04.921898 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4hqj\" (UniqueName: \"kubernetes.io/projected/b63206fe-04b3-4f07-a4cb-f8fd89645931-kube-api-access-m4hqj\") pod \"b63206fe-04b3-4f07-a4cb-f8fd89645931\" (UID: \"b63206fe-04b3-4f07-a4cb-f8fd89645931\") "
Feb 27 00:38:04 crc kubenswrapper[4781]: I0227 00:38:04.928334 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b63206fe-04b3-4f07-a4cb-f8fd89645931-kube-api-access-m4hqj" (OuterVolumeSpecName: "kube-api-access-m4hqj") pod "b63206fe-04b3-4f07-a4cb-f8fd89645931" (UID: "b63206fe-04b3-4f07-a4cb-f8fd89645931"). InnerVolumeSpecName "kube-api-access-m4hqj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.024462 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4hqj\" (UniqueName: \"kubernetes.io/projected/b63206fe-04b3-4f07-a4cb-f8fd89645931-kube-api-access-m4hqj\") on node \"crc\" DevicePath \"\""
Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.034309 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7qb2s"
Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.080249 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7qb2s"
Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.268276 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7qb2s"]
Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.388941 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535878-49z87" event={"ID":"b63206fe-04b3-4f07-a4cb-f8fd89645931","Type":"ContainerDied","Data":"984293dfb6eabf90a79acc02e28249f4b01ff5cb7665ca85189795184da119f5"}
Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.389465 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="984293dfb6eabf90a79acc02e28249f4b01ff5cb7665ca85189795184da119f5"
Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.388972 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535878-49z87"
Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.445933 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535872-fpvhr"]
Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.455216 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535872-fpvhr"]
Feb 27 00:38:06 crc kubenswrapper[4781]: I0227 00:38:06.397576 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7qb2s" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="registry-server" containerID="cri-o://dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893" gracePeriod=2
Feb 27 00:38:06 crc kubenswrapper[4781]: I0227 00:38:06.948991 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7qb2s"
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.066475 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfghf\" (UniqueName: \"kubernetes.io/projected/711e5f04-7574-4aae-921b-84beb876849f-kube-api-access-gfghf\") pod \"711e5f04-7574-4aae-921b-84beb876849f\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") "
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.066570 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-utilities\") pod \"711e5f04-7574-4aae-921b-84beb876849f\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") "
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.066723 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-catalog-content\") pod \"711e5f04-7574-4aae-921b-84beb876849f\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") "
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.068312 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-utilities" (OuterVolumeSpecName: "utilities") pod "711e5f04-7574-4aae-921b-84beb876849f" (UID: "711e5f04-7574-4aae-921b-84beb876849f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.090839 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/711e5f04-7574-4aae-921b-84beb876849f-kube-api-access-gfghf" (OuterVolumeSpecName: "kube-api-access-gfghf") pod "711e5f04-7574-4aae-921b-84beb876849f" (UID: "711e5f04-7574-4aae-921b-84beb876849f"). InnerVolumeSpecName "kube-api-access-gfghf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.136821 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "711e5f04-7574-4aae-921b-84beb876849f" (UID: "711e5f04-7574-4aae-921b-84beb876849f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.169030 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfghf\" (UniqueName: \"kubernetes.io/projected/711e5f04-7574-4aae-921b-84beb876849f-kube-api-access-gfghf\") on node \"crc\" DevicePath \"\""
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.169057 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.169090 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.321479 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28ad6440-a4bb-43a6-985a-42979a799437" path="/var/lib/kubelet/pods/28ad6440-a4bb-43a6-985a-42979a799437/volumes"
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.409416 4781 generic.go:334] "Generic (PLEG): container finished" podID="711e5f04-7574-4aae-921b-84beb876849f" containerID="dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893" exitCode=0
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.409471 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qb2s" event={"ID":"711e5f04-7574-4aae-921b-84beb876849f","Type":"ContainerDied","Data":"dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893"}
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.409502 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qb2s" event={"ID":"711e5f04-7574-4aae-921b-84beb876849f","Type":"ContainerDied","Data":"14c71355d65ecb6c9f56a4511e8798166b28ba0593bf8c51d3f7d5c3c0a96991"}
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.409522 4781 scope.go:117] "RemoveContainer" containerID="dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893"
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.410173 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7qb2s"
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.436973 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7qb2s"]
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.438349 4781 scope.go:117] "RemoveContainer" containerID="c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103"
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.447128 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7qb2s"]
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.460454 4781 scope.go:117] "RemoveContainer" containerID="9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884"
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.515509 4781 scope.go:117] "RemoveContainer" containerID="dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893"
Feb 27 00:38:07 crc kubenswrapper[4781]: E0227 00:38:07.515993 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893\": container with ID starting with dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893 not found: ID does not exist" containerID="dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893"
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.516060 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893"} err="failed to get container status \"dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893\": rpc error: code = NotFound desc = could not find container \"dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893\": container with ID starting with dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893 not found: ID does not exist"
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.516090 4781 scope.go:117] "RemoveContainer" containerID="c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103"
Feb 27 00:38:07 crc kubenswrapper[4781]: E0227 00:38:07.516441 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103\": container with ID starting with c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103 not found: ID does not exist" containerID="c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103"
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.516464 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103"} err="failed to get container status \"c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103\": rpc error: code = NotFound desc = could not find container \"c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103\": container with ID starting with c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103 not found: ID does not exist"
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.516479 4781 scope.go:117] "RemoveContainer" containerID="9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884"
Feb 27 00:38:07 crc kubenswrapper[4781]: E0227 00:38:07.516810 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884\": container with ID starting with 9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884 not found: ID does not exist" containerID="9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884"
Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.516848 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884"} err="failed to get container status \"9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884\": rpc error: code = NotFound desc = could not find container \"9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884\": container with ID starting with 9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884 not found: ID does not exist"
Feb 27 00:38:09 crc kubenswrapper[4781]: I0227 00:38:09.322553 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="711e5f04-7574-4aae-921b-84beb876849f" path="/var/lib/kubelet/pods/711e5f04-7574-4aae-921b-84beb876849f/volumes"
Feb 27 00:38:17 crc kubenswrapper[4781]: I0227 00:38:17.519336 4781 generic.go:334] "Generic (PLEG): container finished" podID="95533111-b2e6-41c2-b7b8-edc0a82e2ca5" containerID="d24a3b741e7a4b7b6e83691b9d42820d7bef593ed487e6d9b52037b61a1964eb" exitCode=0
Feb 27 00:38:17 crc kubenswrapper[4781]: I0227 00:38:17.519440 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" event={"ID":"95533111-b2e6-41c2-b7b8-edc0a82e2ca5","Type":"ContainerDied","Data":"d24a3b741e7a4b7b6e83691b9d42820d7bef593ed487e6d9b52037b61a1964eb"}
Feb 27 00:38:18 crc kubenswrapper[4781]: I0227 00:38:18.312017 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f"
Feb 27 00:38:18 crc kubenswrapper[4781]: E0227 00:38:18.312704 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.096307 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.225874 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpltx\" (UniqueName: \"kubernetes.io/projected/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-kube-api-access-wpltx\") pod \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") "
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.226019 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-ssh-key-openstack-edpm-ipam\") pod \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") "
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.226213 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-inventory\") pod \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") "
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.254817 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-kube-api-access-wpltx" (OuterVolumeSpecName: "kube-api-access-wpltx") pod "95533111-b2e6-41c2-b7b8-edc0a82e2ca5" (UID: "95533111-b2e6-41c2-b7b8-edc0a82e2ca5"). InnerVolumeSpecName "kube-api-access-wpltx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.257119 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-inventory" (OuterVolumeSpecName: "inventory") pod "95533111-b2e6-41c2-b7b8-edc0a82e2ca5" (UID: "95533111-b2e6-41c2-b7b8-edc0a82e2ca5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.264675 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "95533111-b2e6-41c2-b7b8-edc0a82e2ca5" (UID: "95533111-b2e6-41c2-b7b8-edc0a82e2ca5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.328891 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpltx\" (UniqueName: \"kubernetes.io/projected/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-kube-api-access-wpltx\") on node \"crc\" DevicePath \"\""
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.328924 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.328936 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-inventory\") on node \"crc\" DevicePath \"\""
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.539380 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" event={"ID":"95533111-b2e6-41c2-b7b8-edc0a82e2ca5","Type":"ContainerDied","Data":"6accd8b64d2459c3e4f34e1caa40c9f80e86200a6e054165b7c6c4d213fc4543"}
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.539418 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6accd8b64d2459c3e4f34e1caa40c9f80e86200a6e054165b7c6c4d213fc4543"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.539421 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.617456 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j"]
Feb 27 00:38:19 crc kubenswrapper[4781]: E0227 00:38:19.617994 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63206fe-04b3-4f07-a4cb-f8fd89645931" containerName="oc"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618018 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63206fe-04b3-4f07-a4cb-f8fd89645931" containerName="oc"
Feb 27 00:38:19 crc kubenswrapper[4781]: E0227 00:38:19.618039 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95533111-b2e6-41c2-b7b8-edc0a82e2ca5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618049 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="95533111-b2e6-41c2-b7b8-edc0a82e2ca5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 27 00:38:19 crc kubenswrapper[4781]: E0227 00:38:19.618072 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="extract-utilities"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618081 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="extract-utilities"
Feb 27 00:38:19 crc kubenswrapper[4781]: E0227 00:38:19.618096 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="registry-server"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618103 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="registry-server"
Feb 27 00:38:19 crc kubenswrapper[4781]: E0227 00:38:19.618116 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55171d71-37a1-422f-8209-3880be373d30" containerName="extract-content"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618124 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="55171d71-37a1-422f-8209-3880be373d30" containerName="extract-content"
Feb 27 00:38:19 crc kubenswrapper[4781]: E0227 00:38:19.618133 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55171d71-37a1-422f-8209-3880be373d30" containerName="registry-server"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618140 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="55171d71-37a1-422f-8209-3880be373d30" containerName="registry-server"
Feb 27 00:38:19 crc kubenswrapper[4781]: E0227 00:38:19.618157 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55171d71-37a1-422f-8209-3880be373d30" containerName="extract-utilities"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618164 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="55171d71-37a1-422f-8209-3880be373d30" containerName="extract-utilities"
Feb 27 00:38:19 crc kubenswrapper[4781]: E0227 00:38:19.618188 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="extract-content"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618197 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="extract-content"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618423 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b63206fe-04b3-4f07-a4cb-f8fd89645931" containerName="oc"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618439 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="registry-server"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618452 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="55171d71-37a1-422f-8209-3880be373d30" containerName="registry-server"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618491 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="95533111-b2e6-41c2-b7b8-edc0a82e2ca5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.619457 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.628490 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.628895 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.629555 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.629759 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.634875 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.634995 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.635094 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv52b\" (UniqueName: \"kubernetes.io/projected/9f7ced88-662a-42f0-8385-97292a7f4ce4-kube-api-access-xv52b\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.647447 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j"]
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.736471 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.736547 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.736614 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv52b\" (UniqueName: \"kubernetes.io/projected/9f7ced88-662a-42f0-8385-97292a7f4ce4-kube-api-access-xv52b\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.740559 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.740729 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.757502 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv52b\" (UniqueName: \"kubernetes.io/projected/9f7ced88-662a-42f0-8385-97292a7f4ce4-kube-api-access-xv52b\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j"
Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.939079 4781 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" Feb 27 00:38:20 crc kubenswrapper[4781]: I0227 00:38:20.482834 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j"] Feb 27 00:38:20 crc kubenswrapper[4781]: I0227 00:38:20.550439 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" event={"ID":"9f7ced88-662a-42f0-8385-97292a7f4ce4","Type":"ContainerStarted","Data":"8440615a76d5270c1652e37a051e61f9bca649ede1374e62c9bf67b4732ac080"} Feb 27 00:38:22 crc kubenswrapper[4781]: I0227 00:38:22.568167 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" event={"ID":"9f7ced88-662a-42f0-8385-97292a7f4ce4","Type":"ContainerStarted","Data":"f63120de9863b31e6b1e80d8f68fb4bd43f35e4812c5407414823adca9d621df"} Feb 27 00:38:22 crc kubenswrapper[4781]: I0227 00:38:22.591438 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" podStartSLOduration=2.3243168020000002 podStartE2EDuration="3.591418618s" podCreationTimestamp="2026-02-27 00:38:19 +0000 UTC" firstStartedPulling="2026-02-27 00:38:20.488252971 +0000 UTC m=+1969.745792525" lastFinishedPulling="2026-02-27 00:38:21.755354787 +0000 UTC m=+1971.012894341" observedRunningTime="2026-02-27 00:38:22.587420244 +0000 UTC m=+1971.844959798" watchObservedRunningTime="2026-02-27 00:38:22.591418618 +0000 UTC m=+1971.848958172" Feb 27 00:38:26 crc kubenswrapper[4781]: I0227 00:38:26.605914 4781 generic.go:334] "Generic (PLEG): container finished" podID="9f7ced88-662a-42f0-8385-97292a7f4ce4" containerID="f63120de9863b31e6b1e80d8f68fb4bd43f35e4812c5407414823adca9d621df" exitCode=0 Feb 27 00:38:26 crc kubenswrapper[4781]: I0227 00:38:26.607477 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" event={"ID":"9f7ced88-662a-42f0-8385-97292a7f4ce4","Type":"ContainerDied","Data":"f63120de9863b31e6b1e80d8f68fb4bd43f35e4812c5407414823adca9d621df"} Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.162909 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.273316 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-ssh-key-openstack-edpm-ipam\") pod \"9f7ced88-662a-42f0-8385-97292a7f4ce4\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.273437 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv52b\" (UniqueName: \"kubernetes.io/projected/9f7ced88-662a-42f0-8385-97292a7f4ce4-kube-api-access-xv52b\") pod \"9f7ced88-662a-42f0-8385-97292a7f4ce4\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.273767 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-inventory\") pod \"9f7ced88-662a-42f0-8385-97292a7f4ce4\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.279801 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f7ced88-662a-42f0-8385-97292a7f4ce4-kube-api-access-xv52b" (OuterVolumeSpecName: "kube-api-access-xv52b") pod "9f7ced88-662a-42f0-8385-97292a7f4ce4" (UID: "9f7ced88-662a-42f0-8385-97292a7f4ce4"). InnerVolumeSpecName "kube-api-access-xv52b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.305489 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9f7ced88-662a-42f0-8385-97292a7f4ce4" (UID: "9f7ced88-662a-42f0-8385-97292a7f4ce4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.307819 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-inventory" (OuterVolumeSpecName: "inventory") pod "9f7ced88-662a-42f0-8385-97292a7f4ce4" (UID: "9f7ced88-662a-42f0-8385-97292a7f4ce4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.376362 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.376402 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.376414 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv52b\" (UniqueName: \"kubernetes.io/projected/9f7ced88-662a-42f0-8385-97292a7f4ce4-kube-api-access-xv52b\") on node \"crc\" DevicePath \"\"" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.652998 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" 
event={"ID":"9f7ced88-662a-42f0-8385-97292a7f4ce4","Type":"ContainerDied","Data":"8440615a76d5270c1652e37a051e61f9bca649ede1374e62c9bf67b4732ac080"} Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.653063 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8440615a76d5270c1652e37a051e61f9bca649ede1374e62c9bf67b4732ac080" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.653089 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.823851 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj"] Feb 27 00:38:28 crc kubenswrapper[4781]: E0227 00:38:28.824496 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f7ced88-662a-42f0-8385-97292a7f4ce4" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.824514 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f7ced88-662a-42f0-8385-97292a7f4ce4" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.824728 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f7ced88-662a-42f0-8385-97292a7f4ce4" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.825470 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.827564 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.827572 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.828192 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.831828 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.844907 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj"] Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.888267 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.888616 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.888864 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh2pn\" (UniqueName: \"kubernetes.io/projected/29e8157f-b610-48f3-93ac-9173fa6d484a-kube-api-access-gh2pn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.990854 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.990917 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh2pn\" (UniqueName: \"kubernetes.io/projected/29e8157f-b610-48f3-93ac-9173fa6d484a-kube-api-access-gh2pn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.991036 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.997484 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.002517 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.008947 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh2pn\" (UniqueName: \"kubernetes.io/projected/29e8157f-b610-48f3-93ac-9173fa6d484a-kube-api-access-gh2pn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.143514 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.725239 4781 scope.go:117] "RemoveContainer" containerID="d4ee7796e64f1964f0ab74414c33a59e4f95e98e4eb4a260e730590563ac50fe" Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.729549 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj"] Feb 27 00:38:29 crc kubenswrapper[4781]: W0227 00:38:29.735656 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29e8157f_b610_48f3_93ac_9173fa6d484a.slice/crio-76301ff13c2e5a35dd505b8b42a308f0dea15e75e138c72c3fd0670cba71e23e WatchSource:0}: Error finding container 76301ff13c2e5a35dd505b8b42a308f0dea15e75e138c72c3fd0670cba71e23e: Status 404 returned error can't find the container with id 76301ff13c2e5a35dd505b8b42a308f0dea15e75e138c72c3fd0670cba71e23e Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.825376 4781 scope.go:117] "RemoveContainer" containerID="7d9a07674537261cb97d86282370b22b357712af922b31aea2a8cfe67e8a0a4c" Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.872505 4781 scope.go:117] "RemoveContainer" containerID="90d3da646bb32391ad6c504fecd5db68f89221b28accf451c40b52dc228b7d89" Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.921894 4781 scope.go:117] "RemoveContainer" containerID="89638f7647330ea3c5230d3d253e70beeda178adf35863cd73f9bfed5a1f6c4c" Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.947068 4781 scope.go:117] "RemoveContainer" containerID="3dc1eb7dbdd6694e7292463c3972ed88e476b4fd179d083eaeff0cf57f961958" Feb 27 00:38:30 crc kubenswrapper[4781]: I0227 00:38:30.310012 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:38:30 crc kubenswrapper[4781]: E0227 00:38:30.310481 4781 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:38:30 crc kubenswrapper[4781]: I0227 00:38:30.672235 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" event={"ID":"29e8157f-b610-48f3-93ac-9173fa6d484a","Type":"ContainerStarted","Data":"707f4b210ebda3b76fb1a923983ddc8d3406d8cc5d5249610b9d6a6d1ce7e10b"} Feb 27 00:38:30 crc kubenswrapper[4781]: I0227 00:38:30.672314 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" event={"ID":"29e8157f-b610-48f3-93ac-9173fa6d484a","Type":"ContainerStarted","Data":"76301ff13c2e5a35dd505b8b42a308f0dea15e75e138c72c3fd0670cba71e23e"} Feb 27 00:38:30 crc kubenswrapper[4781]: I0227 00:38:30.691796 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" podStartSLOduration=2.261710401 podStartE2EDuration="2.691779311s" podCreationTimestamp="2026-02-27 00:38:28 +0000 UTC" firstStartedPulling="2026-02-27 00:38:29.737397951 +0000 UTC m=+1978.994937495" lastFinishedPulling="2026-02-27 00:38:30.167466851 +0000 UTC m=+1979.425006405" observedRunningTime="2026-02-27 00:38:30.685501947 +0000 UTC m=+1979.943041501" watchObservedRunningTime="2026-02-27 00:38:30.691779311 +0000 UTC m=+1979.949318865" Feb 27 00:38:34 crc kubenswrapper[4781]: I0227 00:38:34.041358 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-kcmlj"] Feb 27 00:38:34 crc kubenswrapper[4781]: I0227 00:38:34.052506 4781 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-api-db-create-kcmlj"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.041033 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-lgv6f"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.057128 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-lgv6f"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.074143 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-141e-account-create-update-msmcr"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.083136 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-9245-account-create-update-j6hsh"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.091554 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cd3e-account-create-update-dkxt7"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.103101 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cd3e-account-create-update-dkxt7"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.114438 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-qx8nd"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.124363 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-9245-account-create-update-j6hsh"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.132425 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-141e-account-create-update-msmcr"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.140841 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-qx8nd"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.356864 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b4dbafa-fefb-4947-8d71-f7b0057a2ba0" 
path="/var/lib/kubelet/pods/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0/volumes" Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.364026 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce" path="/var/lib/kubelet/pods/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce/volumes" Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.365099 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6795d880-5f00-4be4-9c67-6f8a251550cb" path="/var/lib/kubelet/pods/6795d880-5f00-4be4-9c67-6f8a251550cb/volumes" Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.365836 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7468389a-cc9b-404c-9414-4d81f3b1a7e5" path="/var/lib/kubelet/pods/7468389a-cc9b-404c-9414-4d81f3b1a7e5/volumes" Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.366468 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f0e335a-e4a1-48ee-b470-a6277acc5dae" path="/var/lib/kubelet/pods/7f0e335a-e4a1-48ee-b470-a6277acc5dae/volumes" Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.384660 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f8e017-da89-4ce0-a5b7-2339b2cf18a5" path="/var/lib/kubelet/pods/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5/volumes" Feb 27 00:38:45 crc kubenswrapper[4781]: I0227 00:38:45.310174 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:38:45 crc kubenswrapper[4781]: E0227 00:38:45.310913 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 
27 00:38:57 crc kubenswrapper[4781]: I0227 00:38:57.310444 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:38:57 crc kubenswrapper[4781]: E0227 00:38:57.311801 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:39:01 crc kubenswrapper[4781]: I0227 00:39:01.942503 4781 generic.go:334] "Generic (PLEG): container finished" podID="29e8157f-b610-48f3-93ac-9173fa6d484a" containerID="707f4b210ebda3b76fb1a923983ddc8d3406d8cc5d5249610b9d6a6d1ce7e10b" exitCode=0 Feb 27 00:39:01 crc kubenswrapper[4781]: I0227 00:39:01.942596 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" event={"ID":"29e8157f-b610-48f3-93ac-9173fa6d484a","Type":"ContainerDied","Data":"707f4b210ebda3b76fb1a923983ddc8d3406d8cc5d5249610b9d6a6d1ce7e10b"} Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.043503 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9cntr"] Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.058657 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9cntr"] Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.327041 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d71a5c1e-7953-4acf-813a-0d96d4992d1f" path="/var/lib/kubelet/pods/d71a5c1e-7953-4acf-813a-0d96d4992d1f/volumes" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.500057 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.643922 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh2pn\" (UniqueName: \"kubernetes.io/projected/29e8157f-b610-48f3-93ac-9173fa6d484a-kube-api-access-gh2pn\") pod \"29e8157f-b610-48f3-93ac-9173fa6d484a\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.644054 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-inventory\") pod \"29e8157f-b610-48f3-93ac-9173fa6d484a\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.644090 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-ssh-key-openstack-edpm-ipam\") pod \"29e8157f-b610-48f3-93ac-9173fa6d484a\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.650858 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e8157f-b610-48f3-93ac-9173fa6d484a-kube-api-access-gh2pn" (OuterVolumeSpecName: "kube-api-access-gh2pn") pod "29e8157f-b610-48f3-93ac-9173fa6d484a" (UID: "29e8157f-b610-48f3-93ac-9173fa6d484a"). InnerVolumeSpecName "kube-api-access-gh2pn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.687790 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "29e8157f-b610-48f3-93ac-9173fa6d484a" (UID: "29e8157f-b610-48f3-93ac-9173fa6d484a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.692278 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-inventory" (OuterVolumeSpecName: "inventory") pod "29e8157f-b610-48f3-93ac-9173fa6d484a" (UID: "29e8157f-b610-48f3-93ac-9173fa6d484a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.746620 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh2pn\" (UniqueName: \"kubernetes.io/projected/29e8157f-b610-48f3-93ac-9173fa6d484a-kube-api-access-gh2pn\") on node \"crc\" DevicePath \"\"" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.746671 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.746685 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.964283 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" 
event={"ID":"29e8157f-b610-48f3-93ac-9173fa6d484a","Type":"ContainerDied","Data":"76301ff13c2e5a35dd505b8b42a308f0dea15e75e138c72c3fd0670cba71e23e"} Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.964699 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76301ff13c2e5a35dd505b8b42a308f0dea15e75e138c72c3fd0670cba71e23e" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.964763 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.094600 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5"] Feb 27 00:39:04 crc kubenswrapper[4781]: E0227 00:39:04.095102 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e8157f-b610-48f3-93ac-9173fa6d484a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.095116 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e8157f-b610-48f3-93ac-9173fa6d484a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.095330 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e8157f-b610-48f3-93ac-9173fa6d484a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.096170 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.101183 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.101278 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.101928 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.102973 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.115552 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5"] Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.153793 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npqcw\" (UniqueName: \"kubernetes.io/projected/b05a1d9c-7887-4173-99fe-97f7c89cc555-kube-api-access-npqcw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.153860 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.154096 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.256063 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.256186 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npqcw\" (UniqueName: \"kubernetes.io/projected/b05a1d9c-7887-4173-99fe-97f7c89cc555-kube-api-access-npqcw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.256218 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.268549 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.269134 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.273875 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npqcw\" (UniqueName: \"kubernetes.io/projected/b05a1d9c-7887-4173-99fe-97f7c89cc555-kube-api-access-npqcw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.415599 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.958327 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5"] Feb 27 00:39:04 crc kubenswrapper[4781]: W0227 00:39:04.960734 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb05a1d9c_7887_4173_99fe_97f7c89cc555.slice/crio-22e6fedd890698c3f2870ebe4935123b347d59707f469cb36c3175709bf38141 WatchSource:0}: Error finding container 22e6fedd890698c3f2870ebe4935123b347d59707f469cb36c3175709bf38141: Status 404 returned error can't find the container with id 22e6fedd890698c3f2870ebe4935123b347d59707f469cb36c3175709bf38141 Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.980286 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" event={"ID":"b05a1d9c-7887-4173-99fe-97f7c89cc555","Type":"ContainerStarted","Data":"22e6fedd890698c3f2870ebe4935123b347d59707f469cb36c3175709bf38141"} Feb 27 00:39:05 crc kubenswrapper[4781]: I0227 00:39:05.989960 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" event={"ID":"b05a1d9c-7887-4173-99fe-97f7c89cc555","Type":"ContainerStarted","Data":"fdbd60c17b361428ab3bb4e0269dbd498da5588801dd4a7ab30556bebd16a455"} Feb 27 00:39:06 crc kubenswrapper[4781]: I0227 00:39:06.011876 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" podStartSLOduration=1.573851784 podStartE2EDuration="2.011857493s" podCreationTimestamp="2026-02-27 00:39:04 +0000 UTC" firstStartedPulling="2026-02-27 00:39:04.964868684 +0000 UTC m=+2014.222408228" lastFinishedPulling="2026-02-27 00:39:05.402874383 +0000 UTC m=+2014.660413937" 
observedRunningTime="2026-02-27 00:39:06.004565672 +0000 UTC m=+2015.262105256" watchObservedRunningTime="2026-02-27 00:39:06.011857493 +0000 UTC m=+2015.269397057" Feb 27 00:39:12 crc kubenswrapper[4781]: I0227 00:39:12.310077 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:39:12 crc kubenswrapper[4781]: E0227 00:39:12.310902 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:39:22 crc kubenswrapper[4781]: I0227 00:39:22.046282 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-qjkwv"] Feb 27 00:39:22 crc kubenswrapper[4781]: I0227 00:39:22.055873 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-qjkwv"] Feb 27 00:39:23 crc kubenswrapper[4781]: I0227 00:39:23.329732 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd521dc6-4126-4c51-8634-66db8ba1412e" path="/var/lib/kubelet/pods/cd521dc6-4126-4c51-8634-66db8ba1412e/volumes" Feb 27 00:39:26 crc kubenswrapper[4781]: I0227 00:39:26.309960 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:39:26 crc kubenswrapper[4781]: E0227 00:39:26.310227 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:39:29 crc kubenswrapper[4781]: I0227 00:39:29.045583 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tg9k8"] Feb 27 00:39:29 crc kubenswrapper[4781]: I0227 00:39:29.059948 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tg9k8"] Feb 27 00:39:29 crc kubenswrapper[4781]: I0227 00:39:29.321496 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b607db2c-2aa3-48f0-9cd8-c5461797431c" path="/var/lib/kubelet/pods/b607db2c-2aa3-48f0-9cd8-c5461797431c/volumes" Feb 27 00:39:30 crc kubenswrapper[4781]: I0227 00:39:30.093838 4781 scope.go:117] "RemoveContainer" containerID="39276ac01bb5ee770105ba2bf75f8d61d8081e22c89cdaa97c9f7ed7f2722110" Feb 27 00:39:30 crc kubenswrapper[4781]: I0227 00:39:30.168854 4781 scope.go:117] "RemoveContainer" containerID="a4bad047d90bd3b11bea212cddee0782007013387656451beeca5b44aee50150" Feb 27 00:39:30 crc kubenswrapper[4781]: I0227 00:39:30.236552 4781 scope.go:117] "RemoveContainer" containerID="12e5844f351b3d039dc82ba98df27afa29e4eaea9f5b2ec45b3c8cb5d018e0ca" Feb 27 00:39:30 crc kubenswrapper[4781]: I0227 00:39:30.261477 4781 scope.go:117] "RemoveContainer" containerID="ae3d06d551b95e82732253f74b171a292fd2201889c2e3a5a620c3b16fb394dd" Feb 27 00:39:30 crc kubenswrapper[4781]: I0227 00:39:30.312105 4781 scope.go:117] "RemoveContainer" containerID="c9388f02af5b31dc8f5e8ea62ee66fb19cbab695e94e5d03ed46c036e292ce69" Feb 27 00:39:30 crc kubenswrapper[4781]: I0227 00:39:30.374718 4781 scope.go:117] "RemoveContainer" containerID="a7acf67e842e66e4a577e00cfd7561f83ca973cea54d959ed8fb7c9427da2a89" Feb 27 00:39:30 crc kubenswrapper[4781]: I0227 00:39:30.404017 4781 scope.go:117] "RemoveContainer" containerID="feae0a2cae038402fdacbd138e93b4a28e83ea37dfdf069227fa89f2c8eea228" Feb 27 00:39:30 crc kubenswrapper[4781]: 
I0227 00:39:30.423173 4781 scope.go:117] "RemoveContainer" containerID="24536e1e89dfec02307e517e9566052e3516ec64369f8d65d2939b8e4650f889" Feb 27 00:39:30 crc kubenswrapper[4781]: I0227 00:39:30.443132 4781 scope.go:117] "RemoveContainer" containerID="e064657ef0c106a3592f283bb81ae42d2444dda1caced8f721f45cdcfe863108" Feb 27 00:39:37 crc kubenswrapper[4781]: I0227 00:39:37.309980 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:39:37 crc kubenswrapper[4781]: E0227 00:39:37.310701 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:39:49 crc kubenswrapper[4781]: I0227 00:39:49.463831 4781 generic.go:334] "Generic (PLEG): container finished" podID="b05a1d9c-7887-4173-99fe-97f7c89cc555" containerID="fdbd60c17b361428ab3bb4e0269dbd498da5588801dd4a7ab30556bebd16a455" exitCode=0 Feb 27 00:39:49 crc kubenswrapper[4781]: I0227 00:39:49.464058 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" event={"ID":"b05a1d9c-7887-4173-99fe-97f7c89cc555","Type":"ContainerDied","Data":"fdbd60c17b361428ab3bb4e0269dbd498da5588801dd4a7ab30556bebd16a455"} Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.005884 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.104145 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npqcw\" (UniqueName: \"kubernetes.io/projected/b05a1d9c-7887-4173-99fe-97f7c89cc555-kube-api-access-npqcw\") pod \"b05a1d9c-7887-4173-99fe-97f7c89cc555\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.104197 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-ssh-key-openstack-edpm-ipam\") pod \"b05a1d9c-7887-4173-99fe-97f7c89cc555\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.104314 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-inventory\") pod \"b05a1d9c-7887-4173-99fe-97f7c89cc555\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.110868 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b05a1d9c-7887-4173-99fe-97f7c89cc555-kube-api-access-npqcw" (OuterVolumeSpecName: "kube-api-access-npqcw") pod "b05a1d9c-7887-4173-99fe-97f7c89cc555" (UID: "b05a1d9c-7887-4173-99fe-97f7c89cc555"). InnerVolumeSpecName "kube-api-access-npqcw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.135346 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b05a1d9c-7887-4173-99fe-97f7c89cc555" (UID: "b05a1d9c-7887-4173-99fe-97f7c89cc555"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.147802 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-inventory" (OuterVolumeSpecName: "inventory") pod "b05a1d9c-7887-4173-99fe-97f7c89cc555" (UID: "b05a1d9c-7887-4173-99fe-97f7c89cc555"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.206223 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npqcw\" (UniqueName: \"kubernetes.io/projected/b05a1d9c-7887-4173-99fe-97f7c89cc555-kube-api-access-npqcw\") on node \"crc\" DevicePath \"\"" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.206260 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.206270 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.481868 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" 
event={"ID":"b05a1d9c-7887-4173-99fe-97f7c89cc555","Type":"ContainerDied","Data":"22e6fedd890698c3f2870ebe4935123b347d59707f469cb36c3175709bf38141"} Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.481910 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22e6fedd890698c3f2870ebe4935123b347d59707f469cb36c3175709bf38141" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.481963 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.584557 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vvrmt"] Feb 27 00:39:51 crc kubenswrapper[4781]: E0227 00:39:51.584982 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b05a1d9c-7887-4173-99fe-97f7c89cc555" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.585031 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b05a1d9c-7887-4173-99fe-97f7c89cc555" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.585247 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b05a1d9c-7887-4173-99fe-97f7c89cc555" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.586045 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.589269 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.590158 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.590187 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.590504 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.594990 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vvrmt"] Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.723102 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcbjq\" (UniqueName: \"kubernetes.io/projected/35b9cf19-a1cd-48b5-9072-d5c71680c892-kube-api-access-lcbjq\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.723520 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.723572 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.826177 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.826256 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.826402 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcbjq\" (UniqueName: \"kubernetes.io/projected/35b9cf19-a1cd-48b5-9072-d5c71680c892-kube-api-access-lcbjq\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.830059 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: 
I0227 00:39:51.832295 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.843533 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcbjq\" (UniqueName: \"kubernetes.io/projected/35b9cf19-a1cd-48b5-9072-d5c71680c892-kube-api-access-lcbjq\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.909953 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:52 crc kubenswrapper[4781]: I0227 00:39:52.309334 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:39:52 crc kubenswrapper[4781]: I0227 00:39:52.425072 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vvrmt"] Feb 27 00:39:52 crc kubenswrapper[4781]: I0227 00:39:52.491867 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" event={"ID":"35b9cf19-a1cd-48b5-9072-d5c71680c892","Type":"ContainerStarted","Data":"ca371ab0a523aa64c50591b46a3e97f3b89b4de31e340ed13a5023bcc93c87de"} Feb 27 00:39:53 crc kubenswrapper[4781]: I0227 00:39:53.508498 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"4da315c4c7bf218d380bca00c0ade3ee72457fd61b27366edc67ffcf85618e37"} Feb 27 00:39:54 crc 
kubenswrapper[4781]: I0227 00:39:54.519204 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" event={"ID":"35b9cf19-a1cd-48b5-9072-d5c71680c892","Type":"ContainerStarted","Data":"b5b184577d5049b034be6a6f63b1b866cbf4d799620d3da5c03b7145ebd8f076"} Feb 27 00:39:54 crc kubenswrapper[4781]: I0227 00:39:54.539126 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" podStartSLOduration=2.910643313 podStartE2EDuration="3.539101703s" podCreationTimestamp="2026-02-27 00:39:51 +0000 UTC" firstStartedPulling="2026-02-27 00:39:52.442930739 +0000 UTC m=+2061.700470293" lastFinishedPulling="2026-02-27 00:39:53.071389129 +0000 UTC m=+2062.328928683" observedRunningTime="2026-02-27 00:39:54.533605519 +0000 UTC m=+2063.791145073" watchObservedRunningTime="2026-02-27 00:39:54.539101703 +0000 UTC m=+2063.796641247" Feb 27 00:39:59 crc kubenswrapper[4781]: I0227 00:39:59.566404 4781 generic.go:334] "Generic (PLEG): container finished" podID="35b9cf19-a1cd-48b5-9072-d5c71680c892" containerID="b5b184577d5049b034be6a6f63b1b866cbf4d799620d3da5c03b7145ebd8f076" exitCode=0 Feb 27 00:39:59 crc kubenswrapper[4781]: I0227 00:39:59.566497 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" event={"ID":"35b9cf19-a1cd-48b5-9072-d5c71680c892","Type":"ContainerDied","Data":"b5b184577d5049b034be6a6f63b1b866cbf4d799620d3da5c03b7145ebd8f076"} Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.140083 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535880-9cpwk"] Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.142287 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.144906 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.145333 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.145420 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.215254 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535880-9cpwk"] Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.217488 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc9gh\" (UniqueName: \"kubernetes.io/projected/93fc175b-7238-41ec-91f7-17cc07188100-kube-api-access-bc9gh\") pod \"auto-csr-approver-29535880-9cpwk\" (UID: \"93fc175b-7238-41ec-91f7-17cc07188100\") " pod="openshift-infra/auto-csr-approver-29535880-9cpwk" Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.319068 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc9gh\" (UniqueName: \"kubernetes.io/projected/93fc175b-7238-41ec-91f7-17cc07188100-kube-api-access-bc9gh\") pod \"auto-csr-approver-29535880-9cpwk\" (UID: \"93fc175b-7238-41ec-91f7-17cc07188100\") " pod="openshift-infra/auto-csr-approver-29535880-9cpwk" Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.337368 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc9gh\" (UniqueName: \"kubernetes.io/projected/93fc175b-7238-41ec-91f7-17cc07188100-kube-api-access-bc9gh\") pod \"auto-csr-approver-29535880-9cpwk\" (UID: \"93fc175b-7238-41ec-91f7-17cc07188100\") " 
pod="openshift-infra/auto-csr-approver-29535880-9cpwk" Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.530697 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.029220 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535880-9cpwk"] Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.212419 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.347683 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-inventory-0\") pod \"35b9cf19-a1cd-48b5-9072-d5c71680c892\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.347824 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcbjq\" (UniqueName: \"kubernetes.io/projected/35b9cf19-a1cd-48b5-9072-d5c71680c892-kube-api-access-lcbjq\") pod \"35b9cf19-a1cd-48b5-9072-d5c71680c892\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.347915 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-ssh-key-openstack-edpm-ipam\") pod \"35b9cf19-a1cd-48b5-9072-d5c71680c892\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.352932 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35b9cf19-a1cd-48b5-9072-d5c71680c892-kube-api-access-lcbjq" (OuterVolumeSpecName: "kube-api-access-lcbjq") pod 
"35b9cf19-a1cd-48b5-9072-d5c71680c892" (UID: "35b9cf19-a1cd-48b5-9072-d5c71680c892"). InnerVolumeSpecName "kube-api-access-lcbjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.375394 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "35b9cf19-a1cd-48b5-9072-d5c71680c892" (UID: "35b9cf19-a1cd-48b5-9072-d5c71680c892"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.376457 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "35b9cf19-a1cd-48b5-9072-d5c71680c892" (UID: "35b9cf19-a1cd-48b5-9072-d5c71680c892"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.451003 4781 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.451041 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcbjq\" (UniqueName: \"kubernetes.io/projected/35b9cf19-a1cd-48b5-9072-d5c71680c892-kube-api-access-lcbjq\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.451057 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.610987 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.610977 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" event={"ID":"35b9cf19-a1cd-48b5-9072-d5c71680c892","Type":"ContainerDied","Data":"ca371ab0a523aa64c50591b46a3e97f3b89b4de31e340ed13a5023bcc93c87de"} Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.611684 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca371ab0a523aa64c50591b46a3e97f3b89b4de31e340ed13a5023bcc93c87de" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.614669 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" event={"ID":"93fc175b-7238-41ec-91f7-17cc07188100","Type":"ContainerStarted","Data":"d64226a21c9c3afbfc96f1c1e82063d6bba2c61ceba988ed9bedd7298eca0e90"} Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.658017 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts"] Feb 27 00:40:01 crc kubenswrapper[4781]: E0227 00:40:01.658427 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b9cf19-a1cd-48b5-9072-d5c71680c892" containerName="ssh-known-hosts-edpm-deployment" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.658445 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b9cf19-a1cd-48b5-9072-d5c71680c892" containerName="ssh-known-hosts-edpm-deployment" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.658685 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b9cf19-a1cd-48b5-9072-d5c71680c892" containerName="ssh-known-hosts-edpm-deployment" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.661355 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.668992 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts"] Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.717771 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.718004 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.718077 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.718505 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.756838 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8qqb\" (UniqueName: \"kubernetes.io/projected/2a7f1888-0c26-47e0-91b4-fbf07824cab4-kube-api-access-t8qqb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.756900 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.757031 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.859336 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8qqb\" (UniqueName: \"kubernetes.io/projected/2a7f1888-0c26-47e0-91b4-fbf07824cab4-kube-api-access-t8qqb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.859417 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.859489 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.864502 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.867588 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.875524 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8qqb\" (UniqueName: \"kubernetes.io/projected/2a7f1888-0c26-47e0-91b4-fbf07824cab4-kube-api-access-t8qqb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:02 crc kubenswrapper[4781]: I0227 00:40:02.049572 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:02 crc kubenswrapper[4781]: I0227 00:40:02.608414 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts"] Feb 27 00:40:02 crc kubenswrapper[4781]: I0227 00:40:02.630357 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" event={"ID":"93fc175b-7238-41ec-91f7-17cc07188100","Type":"ContainerStarted","Data":"f95b25c7f6b69f37212289ff6ccaf1c8b693e043eb0635c23ef340ef5632fb12"} Feb 27 00:40:02 crc kubenswrapper[4781]: I0227 00:40:02.632875 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" event={"ID":"2a7f1888-0c26-47e0-91b4-fbf07824cab4","Type":"ContainerStarted","Data":"6121ec95ad5ae95181b3e1d0c2b155e2501f4100e8c01f3026d3c448fdecde2c"} Feb 27 00:40:02 crc kubenswrapper[4781]: I0227 00:40:02.648528 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" podStartSLOduration=1.430176149 podStartE2EDuration="2.648511212s" podCreationTimestamp="2026-02-27 00:40:00 +0000 UTC" firstStartedPulling="2026-02-27 00:40:01.041178112 +0000 UTC m=+2070.298717666" lastFinishedPulling="2026-02-27 00:40:02.259513175 +0000 UTC m=+2071.517052729" observedRunningTime="2026-02-27 00:40:02.647939857 +0000 UTC m=+2071.905479421" watchObservedRunningTime="2026-02-27 00:40:02.648511212 +0000 UTC m=+2071.906050766" Feb 27 00:40:03 crc kubenswrapper[4781]: I0227 00:40:03.652549 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" event={"ID":"2a7f1888-0c26-47e0-91b4-fbf07824cab4","Type":"ContainerStarted","Data":"681e0ec5aee424b20955c1f2f0d9d1da7fd9f3929df7cef80074d17dd5991180"} Feb 27 00:40:03 crc kubenswrapper[4781]: I0227 00:40:03.656442 4781 generic.go:334] "Generic 
(PLEG): container finished" podID="93fc175b-7238-41ec-91f7-17cc07188100" containerID="f95b25c7f6b69f37212289ff6ccaf1c8b693e043eb0635c23ef340ef5632fb12" exitCode=0 Feb 27 00:40:03 crc kubenswrapper[4781]: I0227 00:40:03.656488 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" event={"ID":"93fc175b-7238-41ec-91f7-17cc07188100","Type":"ContainerDied","Data":"f95b25c7f6b69f37212289ff6ccaf1c8b693e043eb0635c23ef340ef5632fb12"} Feb 27 00:40:03 crc kubenswrapper[4781]: I0227 00:40:03.713644 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" podStartSLOduration=2.2799835010000002 podStartE2EDuration="2.713598364s" podCreationTimestamp="2026-02-27 00:40:01 +0000 UTC" firstStartedPulling="2026-02-27 00:40:02.590961859 +0000 UTC m=+2071.848501413" lastFinishedPulling="2026-02-27 00:40:03.024576712 +0000 UTC m=+2072.282116276" observedRunningTime="2026-02-27 00:40:03.696398454 +0000 UTC m=+2072.953938008" watchObservedRunningTime="2026-02-27 00:40:03.713598364 +0000 UTC m=+2072.971137928" Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.207452 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.338956 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc9gh\" (UniqueName: \"kubernetes.io/projected/93fc175b-7238-41ec-91f7-17cc07188100-kube-api-access-bc9gh\") pod \"93fc175b-7238-41ec-91f7-17cc07188100\" (UID: \"93fc175b-7238-41ec-91f7-17cc07188100\") " Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.364536 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93fc175b-7238-41ec-91f7-17cc07188100-kube-api-access-bc9gh" (OuterVolumeSpecName: "kube-api-access-bc9gh") pod "93fc175b-7238-41ec-91f7-17cc07188100" (UID: "93fc175b-7238-41ec-91f7-17cc07188100"). InnerVolumeSpecName "kube-api-access-bc9gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.442293 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc9gh\" (UniqueName: \"kubernetes.io/projected/93fc175b-7238-41ec-91f7-17cc07188100-kube-api-access-bc9gh\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.676369 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" event={"ID":"93fc175b-7238-41ec-91f7-17cc07188100","Type":"ContainerDied","Data":"d64226a21c9c3afbfc96f1c1e82063d6bba2c61ceba988ed9bedd7298eca0e90"} Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.676419 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.676427 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d64226a21c9c3afbfc96f1c1e82063d6bba2c61ceba988ed9bedd7298eca0e90" Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.723763 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535874-9b4fw"] Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.732295 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535874-9b4fw"] Feb 27 00:40:07 crc kubenswrapper[4781]: I0227 00:40:07.321820 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21bdad75-a7e5-4940-9ee3-be513a55b97d" path="/var/lib/kubelet/pods/21bdad75-a7e5-4940-9ee3-be513a55b97d/volumes" Feb 27 00:40:10 crc kubenswrapper[4781]: I0227 00:40:10.030673 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-6twxl"] Feb 27 00:40:10 crc kubenswrapper[4781]: I0227 00:40:10.040458 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-6twxl"] Feb 27 00:40:10 crc kubenswrapper[4781]: I0227 00:40:10.740718 4781 generic.go:334] "Generic (PLEG): container finished" podID="2a7f1888-0c26-47e0-91b4-fbf07824cab4" containerID="681e0ec5aee424b20955c1f2f0d9d1da7fd9f3929df7cef80074d17dd5991180" exitCode=0 Feb 27 00:40:10 crc kubenswrapper[4781]: I0227 00:40:10.740807 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" event={"ID":"2a7f1888-0c26-47e0-91b4-fbf07824cab4","Type":"ContainerDied","Data":"681e0ec5aee424b20955c1f2f0d9d1da7fd9f3929df7cef80074d17dd5991180"} Feb 27 00:40:11 crc kubenswrapper[4781]: I0227 00:40:11.325385 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27b0d2a5-5629-42a0-8884-a5534240b356" 
path="/var/lib/kubelet/pods/27b0d2a5-5629-42a0-8884-a5534240b356/volumes" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.256950 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.288225 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-ssh-key-openstack-edpm-ipam\") pod \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.288456 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-inventory\") pod \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.288846 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8qqb\" (UniqueName: \"kubernetes.io/projected/2a7f1888-0c26-47e0-91b4-fbf07824cab4-kube-api-access-t8qqb\") pod \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.325475 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a7f1888-0c26-47e0-91b4-fbf07824cab4-kube-api-access-t8qqb" (OuterVolumeSpecName: "kube-api-access-t8qqb") pod "2a7f1888-0c26-47e0-91b4-fbf07824cab4" (UID: "2a7f1888-0c26-47e0-91b4-fbf07824cab4"). InnerVolumeSpecName "kube-api-access-t8qqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.330728 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-inventory" (OuterVolumeSpecName: "inventory") pod "2a7f1888-0c26-47e0-91b4-fbf07824cab4" (UID: "2a7f1888-0c26-47e0-91b4-fbf07824cab4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.331256 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2a7f1888-0c26-47e0-91b4-fbf07824cab4" (UID: "2a7f1888-0c26-47e0-91b4-fbf07824cab4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.392277 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.392311 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8qqb\" (UniqueName: \"kubernetes.io/projected/2a7f1888-0c26-47e0-91b4-fbf07824cab4-kube-api-access-t8qqb\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.392328 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.761554 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" 
event={"ID":"2a7f1888-0c26-47e0-91b4-fbf07824cab4","Type":"ContainerDied","Data":"6121ec95ad5ae95181b3e1d0c2b155e2501f4100e8c01f3026d3c448fdecde2c"} Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.761597 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6121ec95ad5ae95181b3e1d0c2b155e2501f4100e8c01f3026d3c448fdecde2c" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.761619 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.848121 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz"] Feb 27 00:40:12 crc kubenswrapper[4781]: E0227 00:40:12.848698 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fc175b-7238-41ec-91f7-17cc07188100" containerName="oc" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.848720 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fc175b-7238-41ec-91f7-17cc07188100" containerName="oc" Feb 27 00:40:12 crc kubenswrapper[4781]: E0227 00:40:12.848760 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7f1888-0c26-47e0-91b4-fbf07824cab4" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.848771 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7f1888-0c26-47e0-91b4-fbf07824cab4" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.849053 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7f1888-0c26-47e0-91b4-fbf07824cab4" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.849079 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fc175b-7238-41ec-91f7-17cc07188100" 
containerName="oc" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.850082 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.852310 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.852787 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.852976 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.856029 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.857916 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz"] Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.903504 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.903581 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.903777 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kl6n\" (UniqueName: \"kubernetes.io/projected/98c901e2-eff5-4256-9add-25d09beb51e3-kube-api-access-8kl6n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.006121 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kl6n\" (UniqueName: \"kubernetes.io/projected/98c901e2-eff5-4256-9add-25d09beb51e3-kube-api-access-8kl6n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.006278 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.006328 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.010995 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.011317 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.024704 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kl6n\" (UniqueName: \"kubernetes.io/projected/98c901e2-eff5-4256-9add-25d09beb51e3-kube-api-access-8kl6n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.186760 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.764826 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz"] Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.773412 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" event={"ID":"98c901e2-eff5-4256-9add-25d09beb51e3","Type":"ContainerStarted","Data":"5c51816116cb2a00768333129536aa3bda367f597a1b5a8af5d31966b94ebe8f"} Feb 27 00:40:15 crc kubenswrapper[4781]: I0227 00:40:15.792975 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" event={"ID":"98c901e2-eff5-4256-9add-25d09beb51e3","Type":"ContainerStarted","Data":"81556b2b1512e6b2cac6ee77543475833768d132f92f55999253cefef07fe4fe"} Feb 27 00:40:15 crc kubenswrapper[4781]: I0227 00:40:15.812011 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" podStartSLOduration=2.89051899 podStartE2EDuration="3.81199201s" podCreationTimestamp="2026-02-27 00:40:12 +0000 UTC" firstStartedPulling="2026-02-27 00:40:13.763702157 +0000 UTC m=+2083.021241711" lastFinishedPulling="2026-02-27 00:40:14.685175167 +0000 UTC m=+2083.942714731" observedRunningTime="2026-02-27 00:40:15.808183161 +0000 UTC m=+2085.065722715" watchObservedRunningTime="2026-02-27 00:40:15.81199201 +0000 UTC m=+2085.069531564" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.421878 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ngjmc"] Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.424614 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.433016 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngjmc"] Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.534180 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-utilities\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.534461 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-catalog-content\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.534543 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8v97\" (UniqueName: \"kubernetes.io/projected/c5ccb94d-a0c4-4247-85cc-76049a84eef6-kube-api-access-b8v97\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.636847 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-catalog-content\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.636904 4781 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-b8v97\" (UniqueName: \"kubernetes.io/projected/c5ccb94d-a0c4-4247-85cc-76049a84eef6-kube-api-access-b8v97\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.637107 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-utilities\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.637762 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-catalog-content\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.637957 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-utilities\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.661676 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8v97\" (UniqueName: \"kubernetes.io/projected/c5ccb94d-a0c4-4247-85cc-76049a84eef6-kube-api-access-b8v97\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.745959 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.881440 4781 generic.go:334] "Generic (PLEG): container finished" podID="98c901e2-eff5-4256-9add-25d09beb51e3" containerID="81556b2b1512e6b2cac6ee77543475833768d132f92f55999253cefef07fe4fe" exitCode=0 Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.881498 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" event={"ID":"98c901e2-eff5-4256-9add-25d09beb51e3","Type":"ContainerDied","Data":"81556b2b1512e6b2cac6ee77543475833768d132f92f55999253cefef07fe4fe"} Feb 27 00:40:24 crc kubenswrapper[4781]: I0227 00:40:24.233492 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngjmc"] Feb 27 00:40:24 crc kubenswrapper[4781]: I0227 00:40:24.891297 4781 generic.go:334] "Generic (PLEG): container finished" podID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerID="1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914" exitCode=0 Feb 27 00:40:24 crc kubenswrapper[4781]: I0227 00:40:24.891412 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngjmc" event={"ID":"c5ccb94d-a0c4-4247-85cc-76049a84eef6","Type":"ContainerDied","Data":"1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914"} Feb 27 00:40:24 crc kubenswrapper[4781]: I0227 00:40:24.891707 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngjmc" event={"ID":"c5ccb94d-a0c4-4247-85cc-76049a84eef6","Type":"ContainerStarted","Data":"c55aee4887a25d0bd23791f9a694c6155337621e7dac3a4e5f392a9f73d0d36d"} Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.401001 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.482015 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-ssh-key-openstack-edpm-ipam\") pod \"98c901e2-eff5-4256-9add-25d09beb51e3\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.482143 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kl6n\" (UniqueName: \"kubernetes.io/projected/98c901e2-eff5-4256-9add-25d09beb51e3-kube-api-access-8kl6n\") pod \"98c901e2-eff5-4256-9add-25d09beb51e3\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.482186 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-inventory\") pod \"98c901e2-eff5-4256-9add-25d09beb51e3\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.487365 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c901e2-eff5-4256-9add-25d09beb51e3-kube-api-access-8kl6n" (OuterVolumeSpecName: "kube-api-access-8kl6n") pod "98c901e2-eff5-4256-9add-25d09beb51e3" (UID: "98c901e2-eff5-4256-9add-25d09beb51e3"). InnerVolumeSpecName "kube-api-access-8kl6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.510203 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "98c901e2-eff5-4256-9add-25d09beb51e3" (UID: "98c901e2-eff5-4256-9add-25d09beb51e3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.515126 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-inventory" (OuterVolumeSpecName: "inventory") pod "98c901e2-eff5-4256-9add-25d09beb51e3" (UID: "98c901e2-eff5-4256-9add-25d09beb51e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.585332 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.585373 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.585398 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kl6n\" (UniqueName: \"kubernetes.io/projected/98c901e2-eff5-4256-9add-25d09beb51e3-kube-api-access-8kl6n\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.904131 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" 
event={"ID":"98c901e2-eff5-4256-9add-25d09beb51e3","Type":"ContainerDied","Data":"5c51816116cb2a00768333129536aa3bda367f597a1b5a8af5d31966b94ebe8f"} Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.904485 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c51816116cb2a00768333129536aa3bda367f597a1b5a8af5d31966b94ebe8f" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.904145 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.908085 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngjmc" event={"ID":"c5ccb94d-a0c4-4247-85cc-76049a84eef6","Type":"ContainerStarted","Data":"1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73"} Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.029484 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894"] Feb 27 00:40:26 crc kubenswrapper[4781]: E0227 00:40:26.030200 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c901e2-eff5-4256-9add-25d09beb51e3" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.030223 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c901e2-eff5-4256-9add-25d09beb51e3" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.030482 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c901e2-eff5-4256-9add-25d09beb51e3" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.031300 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.034281 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.034776 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.034911 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.034944 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.035655 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.035941 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.036309 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.036498 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.043291 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894"] Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.094933 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095000 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095049 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095108 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095135 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095181 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095205 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095280 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095314 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095450 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkm55\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-kube-api-access-kkm55\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095511 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095537 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095578 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095615 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.197676 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.197766 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.197918 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkm55\" (UniqueName: 
\"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-kube-api-access-kkm55\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.197999 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198035 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198110 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198179 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198274 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198357 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198441 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198590 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198670 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198784 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198820 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.204409 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 
00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.204565 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.204659 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.204707 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.205430 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 
00:40:26.205482 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.205572 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.205733 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.206341 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.206784 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.206811 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.207606 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.207870 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.216377 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkm55\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-kube-api-access-kkm55\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: 
\"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.351261 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.881760 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894"] Feb 27 00:40:26 crc kubenswrapper[4781]: W0227 00:40:26.882185 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dace61f_2e30_4132_9ce6_1cb1c8a6cedc.slice/crio-fdbe6dbed5dcbc407181e019c6042c54a0b20057226462faf8d516b97be9c31e WatchSource:0}: Error finding container fdbe6dbed5dcbc407181e019c6042c54a0b20057226462faf8d516b97be9c31e: Status 404 returned error can't find the container with id fdbe6dbed5dcbc407181e019c6042c54a0b20057226462faf8d516b97be9c31e Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.922285 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" event={"ID":"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc","Type":"ContainerStarted","Data":"fdbe6dbed5dcbc407181e019c6042c54a0b20057226462faf8d516b97be9c31e"} Feb 27 00:40:29 crc kubenswrapper[4781]: I0227 00:40:29.954166 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" event={"ID":"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc","Type":"ContainerStarted","Data":"963155c7618099d355cb8c863003fa537b8c82a66251e59d4f497102028cdca7"} Feb 27 00:40:29 crc kubenswrapper[4781]: I0227 00:40:29.984996 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" podStartSLOduration=2.017666099 
podStartE2EDuration="3.984970239s" podCreationTimestamp="2026-02-27 00:40:26 +0000 UTC" firstStartedPulling="2026-02-27 00:40:26.885881057 +0000 UTC m=+2096.143420611" lastFinishedPulling="2026-02-27 00:40:28.853185197 +0000 UTC m=+2098.110724751" observedRunningTime="2026-02-27 00:40:29.97660397 +0000 UTC m=+2099.234143544" watchObservedRunningTime="2026-02-27 00:40:29.984970239 +0000 UTC m=+2099.242509793" Feb 27 00:40:30 crc kubenswrapper[4781]: I0227 00:40:30.643760 4781 scope.go:117] "RemoveContainer" containerID="603be41f44dabcefd367f03b819f0e12526431539cc454d1e0a0fbbe4c354d4e" Feb 27 00:40:30 crc kubenswrapper[4781]: I0227 00:40:30.708511 4781 scope.go:117] "RemoveContainer" containerID="172b3310c26572010bb7e76f998ac931b571b090edac45e7e85d3b3c5cd6c47d" Feb 27 00:40:34 crc kubenswrapper[4781]: I0227 00:40:34.998882 4781 generic.go:334] "Generic (PLEG): container finished" podID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerID="1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73" exitCode=0 Feb 27 00:40:34 crc kubenswrapper[4781]: I0227 00:40:34.998963 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngjmc" event={"ID":"c5ccb94d-a0c4-4247-85cc-76049a84eef6","Type":"ContainerDied","Data":"1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73"} Feb 27 00:40:36 crc kubenswrapper[4781]: I0227 00:40:36.012747 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngjmc" event={"ID":"c5ccb94d-a0c4-4247-85cc-76049a84eef6","Type":"ContainerStarted","Data":"1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058"} Feb 27 00:40:36 crc kubenswrapper[4781]: I0227 00:40:36.043543 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ngjmc" podStartSLOduration=2.534584912 podStartE2EDuration="13.043519367s" podCreationTimestamp="2026-02-27 00:40:23 +0000 UTC" 
firstStartedPulling="2026-02-27 00:40:24.893318648 +0000 UTC m=+2094.150858202" lastFinishedPulling="2026-02-27 00:40:35.402253103 +0000 UTC m=+2104.659792657" observedRunningTime="2026-02-27 00:40:36.034265126 +0000 UTC m=+2105.291804690" watchObservedRunningTime="2026-02-27 00:40:36.043519367 +0000 UTC m=+2105.301058921" Feb 27 00:40:43 crc kubenswrapper[4781]: I0227 00:40:43.746742 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:43 crc kubenswrapper[4781]: I0227 00:40:43.747213 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:43 crc kubenswrapper[4781]: I0227 00:40:43.794700 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:44 crc kubenswrapper[4781]: I0227 00:40:44.152570 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:44 crc kubenswrapper[4781]: I0227 00:40:44.201715 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngjmc"] Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.105792 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ngjmc" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerName="registry-server" containerID="cri-o://1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058" gracePeriod=2 Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.673564 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.842991 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-catalog-content\") pod \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.843546 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8v97\" (UniqueName: \"kubernetes.io/projected/c5ccb94d-a0c4-4247-85cc-76049a84eef6-kube-api-access-b8v97\") pod \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.843680 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-utilities\") pod \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.844324 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-utilities" (OuterVolumeSpecName: "utilities") pod "c5ccb94d-a0c4-4247-85cc-76049a84eef6" (UID: "c5ccb94d-a0c4-4247-85cc-76049a84eef6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.852532 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5ccb94d-a0c4-4247-85cc-76049a84eef6-kube-api-access-b8v97" (OuterVolumeSpecName: "kube-api-access-b8v97") pod "c5ccb94d-a0c4-4247-85cc-76049a84eef6" (UID: "c5ccb94d-a0c4-4247-85cc-76049a84eef6"). InnerVolumeSpecName "kube-api-access-b8v97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.945447 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8v97\" (UniqueName: \"kubernetes.io/projected/c5ccb94d-a0c4-4247-85cc-76049a84eef6-kube-api-access-b8v97\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.945486 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.980562 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5ccb94d-a0c4-4247-85cc-76049a84eef6" (UID: "c5ccb94d-a0c4-4247-85cc-76049a84eef6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.047489 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.118787 4781 generic.go:334] "Generic (PLEG): container finished" podID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerID="1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058" exitCode=0 Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.118836 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.118834 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngjmc" event={"ID":"c5ccb94d-a0c4-4247-85cc-76049a84eef6","Type":"ContainerDied","Data":"1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058"} Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.118994 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngjmc" event={"ID":"c5ccb94d-a0c4-4247-85cc-76049a84eef6","Type":"ContainerDied","Data":"c55aee4887a25d0bd23791f9a694c6155337621e7dac3a4e5f392a9f73d0d36d"} Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.119022 4781 scope.go:117] "RemoveContainer" containerID="1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.150087 4781 scope.go:117] "RemoveContainer" containerID="1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.166419 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngjmc"] Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.175917 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ngjmc"] Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.182861 4781 scope.go:117] "RemoveContainer" containerID="1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.233600 4781 scope.go:117] "RemoveContainer" containerID="1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058" Feb 27 00:40:47 crc kubenswrapper[4781]: E0227 00:40:47.234034 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058\": container with ID starting with 1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058 not found: ID does not exist" containerID="1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.234065 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058"} err="failed to get container status \"1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058\": rpc error: code = NotFound desc = could not find container \"1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058\": container with ID starting with 1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058 not found: ID does not exist" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.234086 4781 scope.go:117] "RemoveContainer" containerID="1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73" Feb 27 00:40:47 crc kubenswrapper[4781]: E0227 00:40:47.234346 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73\": container with ID starting with 1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73 not found: ID does not exist" containerID="1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.234367 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73"} err="failed to get container status \"1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73\": rpc error: code = NotFound desc = could not find container \"1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73\": container with ID 
starting with 1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73 not found: ID does not exist" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.234384 4781 scope.go:117] "RemoveContainer" containerID="1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914" Feb 27 00:40:47 crc kubenswrapper[4781]: E0227 00:40:47.234692 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914\": container with ID starting with 1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914 not found: ID does not exist" containerID="1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.234710 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914"} err="failed to get container status \"1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914\": rpc error: code = NotFound desc = could not find container \"1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914\": container with ID starting with 1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914 not found: ID does not exist" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.342707 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" path="/var/lib/kubelet/pods/c5ccb94d-a0c4-4247-85cc-76049a84eef6/volumes" Feb 27 00:41:02 crc kubenswrapper[4781]: I0227 00:41:02.292986 4781 generic.go:334] "Generic (PLEG): container finished" podID="0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" containerID="963155c7618099d355cb8c863003fa537b8c82a66251e59d4f497102028cdca7" exitCode=0 Feb 27 00:41:02 crc kubenswrapper[4781]: I0227 00:41:02.293099 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" event={"ID":"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc","Type":"ContainerDied","Data":"963155c7618099d355cb8c863003fa537b8c82a66251e59d4f497102028cdca7"} Feb 27 00:41:03 crc kubenswrapper[4781]: I0227 00:41:03.872363 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022195 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022251 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-libvirt-combined-ca-bundle\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022344 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-inventory\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022403 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-neutron-metadata-combined-ca-bundle\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: 
I0227 00:41:04.022439 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022482 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-repo-setup-combined-ca-bundle\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022549 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ssh-key-openstack-edpm-ipam\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022571 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-telemetry-combined-ca-bundle\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022606 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-nova-combined-ca-bundle\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022647 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-bootstrap-combined-ca-bundle\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022667 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkm55\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-kube-api-access-kkm55\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022684 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022707 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022805 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ovn-combined-ca-bundle\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.036986 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-kube-api-access-kkm55" (OuterVolumeSpecName: "kube-api-access-kkm55") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "kube-api-access-kkm55". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.039040 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.040077 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.040339 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.040358 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.040415 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.040623 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.040675 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.040955 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.041613 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.041886 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.044248 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). 
InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.066589 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-inventory" (OuterVolumeSpecName: "inventory") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.077466 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125024 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125066 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125080 4781 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125091 4781 reconciler_common.go:293] "Volume 
detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125100 4781 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125109 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125119 4781 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125128 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125136 4781 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125145 4781 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc 
kubenswrapper[4781]: I0227 00:41:04.125153 4781 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125161 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkm55\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-kube-api-access-kkm55\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125169 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125182 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.315753 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" event={"ID":"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc","Type":"ContainerDied","Data":"fdbe6dbed5dcbc407181e019c6042c54a0b20057226462faf8d516b97be9c31e"} Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.315801 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdbe6dbed5dcbc407181e019c6042c54a0b20057226462faf8d516b97be9c31e" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.315818 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.423795 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw"] Feb 27 00:41:04 crc kubenswrapper[4781]: E0227 00:41:04.424322 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerName="extract-utilities" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.424342 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerName="extract-utilities" Feb 27 00:41:04 crc kubenswrapper[4781]: E0227 00:41:04.424369 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerName="extract-content" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.424377 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerName="extract-content" Feb 27 00:41:04 crc kubenswrapper[4781]: E0227 00:41:04.424393 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.424402 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 27 00:41:04 crc kubenswrapper[4781]: E0227 00:41:04.424414 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerName="registry-server" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.424421 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerName="registry-server" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.424717 
4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.424748 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerName="registry-server" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.425691 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.429177 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.429952 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.430297 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.431619 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.432928 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.435959 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw"] Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.532768 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: 
\"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.532847 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.532884 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xkxl\" (UniqueName: \"kubernetes.io/projected/e61bcd0e-2490-4f8e-a429-cf07405dc01b-kube-api-access-4xkxl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.533004 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.533073 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.635130 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.635199 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xkxl\" (UniqueName: \"kubernetes.io/projected/e61bcd0e-2490-4f8e-a429-cf07405dc01b-kube-api-access-4xkxl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.635320 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.635352 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.635390 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: 
\"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.636559 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.640056 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.642961 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.646920 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.656340 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xkxl\" (UniqueName: 
\"kubernetes.io/projected/e61bcd0e-2490-4f8e-a429-cf07405dc01b-kube-api-access-4xkxl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.741955 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:05 crc kubenswrapper[4781]: I0227 00:41:05.261275 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw"] Feb 27 00:41:05 crc kubenswrapper[4781]: I0227 00:41:05.336140 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" event={"ID":"e61bcd0e-2490-4f8e-a429-cf07405dc01b","Type":"ContainerStarted","Data":"51c501825e1346a8a2e129063a87294aecfd1918c864ddba2f164fb624184d12"} Feb 27 00:41:06 crc kubenswrapper[4781]: I0227 00:41:06.346908 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" event={"ID":"e61bcd0e-2490-4f8e-a429-cf07405dc01b","Type":"ContainerStarted","Data":"6ed88a448040dd872eaf65d70e7642dc99ecfe2b0ddaf21643e90282bdc141d5"} Feb 27 00:41:06 crc kubenswrapper[4781]: I0227 00:41:06.372653 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" podStartSLOduration=1.946775497 podStartE2EDuration="2.372618167s" podCreationTimestamp="2026-02-27 00:41:04 +0000 UTC" firstStartedPulling="2026-02-27 00:41:05.264176523 +0000 UTC m=+2134.521716077" lastFinishedPulling="2026-02-27 00:41:05.690019193 +0000 UTC m=+2134.947558747" observedRunningTime="2026-02-27 00:41:06.36660332 +0000 UTC m=+2135.624142874" watchObservedRunningTime="2026-02-27 00:41:06.372618167 +0000 UTC m=+2135.630157721" Feb 27 00:41:24 crc kubenswrapper[4781]: I0227 
00:41:24.043813 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-6wz7g"] Feb 27 00:41:24 crc kubenswrapper[4781]: I0227 00:41:24.055504 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-6wz7g"] Feb 27 00:41:25 crc kubenswrapper[4781]: I0227 00:41:25.670795 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b669382e-dffc-421d-80a3-82b928f54044" path="/var/lib/kubelet/pods/b669382e-dffc-421d-80a3-82b928f54044/volumes" Feb 27 00:41:30 crc kubenswrapper[4781]: I0227 00:41:30.804532 4781 scope.go:117] "RemoveContainer" containerID="08009d33d7dd60364f173703aa207fb7fe65cb10f22855e575d2a1e3d49e40a0" Feb 27 00:41:31 crc kubenswrapper[4781]: I0227 00:41:31.026112 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-c5vn9"] Feb 27 00:41:31 crc kubenswrapper[4781]: I0227 00:41:31.034988 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-c5vn9"] Feb 27 00:41:31 crc kubenswrapper[4781]: I0227 00:41:31.322229 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee23b33-5d55-45c9-b024-0b4865019095" path="/var/lib/kubelet/pods/fee23b33-5d55-45c9-b024-0b4865019095/volumes" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.155561 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535882-skl65"] Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.159064 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535882-skl65" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.161171 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.161532 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.162177 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.172281 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535882-skl65"] Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.303257 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhcrr\" (UniqueName: \"kubernetes.io/projected/29db339c-88ad-410b-bad1-e5f5328e9a0a-kube-api-access-vhcrr\") pod \"auto-csr-approver-29535882-skl65\" (UID: \"29db339c-88ad-410b-bad1-e5f5328e9a0a\") " pod="openshift-infra/auto-csr-approver-29535882-skl65" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.405681 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhcrr\" (UniqueName: \"kubernetes.io/projected/29db339c-88ad-410b-bad1-e5f5328e9a0a-kube-api-access-vhcrr\") pod \"auto-csr-approver-29535882-skl65\" (UID: \"29db339c-88ad-410b-bad1-e5f5328e9a0a\") " pod="openshift-infra/auto-csr-approver-29535882-skl65" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.425397 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhcrr\" (UniqueName: \"kubernetes.io/projected/29db339c-88ad-410b-bad1-e5f5328e9a0a-kube-api-access-vhcrr\") pod \"auto-csr-approver-29535882-skl65\" (UID: \"29db339c-88ad-410b-bad1-e5f5328e9a0a\") " 
pod="openshift-infra/auto-csr-approver-29535882-skl65" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.489811 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535882-skl65" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.991549 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535882-skl65"] Feb 27 00:42:01 crc kubenswrapper[4781]: I0227 00:42:01.030543 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535882-skl65" event={"ID":"29db339c-88ad-410b-bad1-e5f5328e9a0a","Type":"ContainerStarted","Data":"68c1340bfed486bf6f77881531a13b2ec6ce5bdb5c017e706d1af3feb87c99af"} Feb 27 00:42:03 crc kubenswrapper[4781]: I0227 00:42:03.063528 4781 generic.go:334] "Generic (PLEG): container finished" podID="e61bcd0e-2490-4f8e-a429-cf07405dc01b" containerID="6ed88a448040dd872eaf65d70e7642dc99ecfe2b0ddaf21643e90282bdc141d5" exitCode=0 Feb 27 00:42:03 crc kubenswrapper[4781]: I0227 00:42:03.063652 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" event={"ID":"e61bcd0e-2490-4f8e-a429-cf07405dc01b","Type":"ContainerDied","Data":"6ed88a448040dd872eaf65d70e7642dc99ecfe2b0ddaf21643e90282bdc141d5"} Feb 27 00:42:03 crc kubenswrapper[4781]: I0227 00:42:03.066412 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535882-skl65" event={"ID":"29db339c-88ad-410b-bad1-e5f5328e9a0a","Type":"ContainerStarted","Data":"bcc82c4ff93196fe9d1d81964a39e384053e68533a13a500ed58309dd14ee8eb"} Feb 27 00:42:03 crc kubenswrapper[4781]: I0227 00:42:03.091894 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535882-skl65" podStartSLOduration=1.294506254 podStartE2EDuration="3.091875121s" podCreationTimestamp="2026-02-27 00:42:00 +0000 UTC" 
firstStartedPulling="2026-02-27 00:42:00.998616758 +0000 UTC m=+2190.256156312" lastFinishedPulling="2026-02-27 00:42:02.795985615 +0000 UTC m=+2192.053525179" observedRunningTime="2026-02-27 00:42:03.091111431 +0000 UTC m=+2192.348651005" watchObservedRunningTime="2026-02-27 00:42:03.091875121 +0000 UTC m=+2192.349414675" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.081515 4781 generic.go:334] "Generic (PLEG): container finished" podID="29db339c-88ad-410b-bad1-e5f5328e9a0a" containerID="bcc82c4ff93196fe9d1d81964a39e384053e68533a13a500ed58309dd14ee8eb" exitCode=0 Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.081652 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535882-skl65" event={"ID":"29db339c-88ad-410b-bad1-e5f5328e9a0a","Type":"ContainerDied","Data":"bcc82c4ff93196fe9d1d81964a39e384053e68533a13a500ed58309dd14ee8eb"} Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.695780 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.706519 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-inventory\") pod \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.706710 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ssh-key-openstack-edpm-ipam\") pod \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.708264 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovncontroller-config-0\") pod \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.708343 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xkxl\" (UniqueName: \"kubernetes.io/projected/e61bcd0e-2490-4f8e-a429-cf07405dc01b-kube-api-access-4xkxl\") pod \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.708501 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovn-combined-ca-bundle\") pod \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.713584 4781 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e61bcd0e-2490-4f8e-a429-cf07405dc01b" (UID: "e61bcd0e-2490-4f8e-a429-cf07405dc01b"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.716073 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e61bcd0e-2490-4f8e-a429-cf07405dc01b-kube-api-access-4xkxl" (OuterVolumeSpecName: "kube-api-access-4xkxl") pod "e61bcd0e-2490-4f8e-a429-cf07405dc01b" (UID: "e61bcd0e-2490-4f8e-a429-cf07405dc01b"). InnerVolumeSpecName "kube-api-access-4xkxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.743181 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e61bcd0e-2490-4f8e-a429-cf07405dc01b" (UID: "e61bcd0e-2490-4f8e-a429-cf07405dc01b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.753148 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "e61bcd0e-2490-4f8e-a429-cf07405dc01b" (UID: "e61bcd0e-2490-4f8e-a429-cf07405dc01b"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.767267 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-inventory" (OuterVolumeSpecName: "inventory") pod "e61bcd0e-2490-4f8e-a429-cf07405dc01b" (UID: "e61bcd0e-2490-4f8e-a429-cf07405dc01b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.811932 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.811976 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.811990 4781 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.812002 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xkxl\" (UniqueName: \"kubernetes.io/projected/e61bcd0e-2490-4f8e-a429-cf07405dc01b-kube-api-access-4xkxl\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.812015 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.093689 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" event={"ID":"e61bcd0e-2490-4f8e-a429-cf07405dc01b","Type":"ContainerDied","Data":"51c501825e1346a8a2e129063a87294aecfd1918c864ddba2f164fb624184d12"} Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.093725 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.093739 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51c501825e1346a8a2e129063a87294aecfd1918c864ddba2f164fb624184d12" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.157427 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq"] Feb 27 00:42:05 crc kubenswrapper[4781]: E0227 00:42:05.158421 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e61bcd0e-2490-4f8e-a429-cf07405dc01b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.158458 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61bcd0e-2490-4f8e-a429-cf07405dc01b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.158914 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e61bcd0e-2490-4f8e-a429-cf07405dc01b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.161195 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.163113 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.165853 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.166102 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.166293 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.166559 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.168070 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.175219 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq"] Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.220500 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.220644 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.220766 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.220790 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.220809 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.220830 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntw6w\" (UniqueName: \"kubernetes.io/projected/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-kube-api-access-ntw6w\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.324605 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.324716 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.324744 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.324781 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntw6w\" (UniqueName: 
\"kubernetes.io/projected/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-kube-api-access-ntw6w\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.324814 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.324917 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.329879 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.330505 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.330769 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.331660 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.338263 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.343502 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntw6w\" (UniqueName: \"kubernetes.io/projected/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-kube-api-access-ntw6w\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: 
\"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.497327 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.501454 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535882-skl65" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.528937 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhcrr\" (UniqueName: \"kubernetes.io/projected/29db339c-88ad-410b-bad1-e5f5328e9a0a-kube-api-access-vhcrr\") pod \"29db339c-88ad-410b-bad1-e5f5328e9a0a\" (UID: \"29db339c-88ad-410b-bad1-e5f5328e9a0a\") " Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.534570 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29db339c-88ad-410b-bad1-e5f5328e9a0a-kube-api-access-vhcrr" (OuterVolumeSpecName: "kube-api-access-vhcrr") pod "29db339c-88ad-410b-bad1-e5f5328e9a0a" (UID: "29db339c-88ad-410b-bad1-e5f5328e9a0a"). InnerVolumeSpecName "kube-api-access-vhcrr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.632540 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhcrr\" (UniqueName: \"kubernetes.io/projected/29db339c-88ad-410b-bad1-e5f5328e9a0a-kube-api-access-vhcrr\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:06 crc kubenswrapper[4781]: I0227 00:42:06.086476 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq"] Feb 27 00:42:06 crc kubenswrapper[4781]: W0227 00:42:06.088831 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a3e8437_2d3f_44a9_bb1a_8b3de1e91c87.slice/crio-e006780cf4daccda68c6b25b213909e5856f96dd7fd97d562673bc7d27726bb2 WatchSource:0}: Error finding container e006780cf4daccda68c6b25b213909e5856f96dd7fd97d562673bc7d27726bb2: Status 404 returned error can't find the container with id e006780cf4daccda68c6b25b213909e5856f96dd7fd97d562673bc7d27726bb2 Feb 27 00:42:06 crc kubenswrapper[4781]: I0227 00:42:06.108034 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535882-skl65" event={"ID":"29db339c-88ad-410b-bad1-e5f5328e9a0a","Type":"ContainerDied","Data":"68c1340bfed486bf6f77881531a13b2ec6ce5bdb5c017e706d1af3feb87c99af"} Feb 27 00:42:06 crc kubenswrapper[4781]: I0227 00:42:06.108083 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68c1340bfed486bf6f77881531a13b2ec6ce5bdb5c017e706d1af3feb87c99af" Feb 27 00:42:06 crc kubenswrapper[4781]: I0227 00:42:06.108051 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535882-skl65" Feb 27 00:42:06 crc kubenswrapper[4781]: I0227 00:42:06.112948 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" event={"ID":"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87","Type":"ContainerStarted","Data":"e006780cf4daccda68c6b25b213909e5856f96dd7fd97d562673bc7d27726bb2"} Feb 27 00:42:06 crc kubenswrapper[4781]: I0227 00:42:06.166673 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535876-2l88l"] Feb 27 00:42:06 crc kubenswrapper[4781]: I0227 00:42:06.176644 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535876-2l88l"] Feb 27 00:42:07 crc kubenswrapper[4781]: I0227 00:42:07.123378 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" event={"ID":"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87","Type":"ContainerStarted","Data":"e8eaf2ec603b6edc8badc09564a9675ccb658970cd78310fb0d45ee49918516f"} Feb 27 00:42:07 crc kubenswrapper[4781]: I0227 00:42:07.145933 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" podStartSLOduration=1.749421307 podStartE2EDuration="2.145914081s" podCreationTimestamp="2026-02-27 00:42:05 +0000 UTC" firstStartedPulling="2026-02-27 00:42:06.093776234 +0000 UTC m=+2195.351315788" lastFinishedPulling="2026-02-27 00:42:06.490269008 +0000 UTC m=+2195.747808562" observedRunningTime="2026-02-27 00:42:07.140360645 +0000 UTC m=+2196.397900199" watchObservedRunningTime="2026-02-27 00:42:07.145914081 +0000 UTC m=+2196.403453635" Feb 27 00:42:07 crc kubenswrapper[4781]: I0227 00:42:07.348480 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9301966-9820-4623-8393-f185a0616743" 
path="/var/lib/kubelet/pods/f9301966-9820-4623-8393-f185a0616743/volumes" Feb 27 00:42:12 crc kubenswrapper[4781]: I0227 00:42:12.894989 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:42:12 crc kubenswrapper[4781]: I0227 00:42:12.895501 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.125602 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j5kdg"] Feb 27 00:42:20 crc kubenswrapper[4781]: E0227 00:42:20.126796 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29db339c-88ad-410b-bad1-e5f5328e9a0a" containerName="oc" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.126812 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="29db339c-88ad-410b-bad1-e5f5328e9a0a" containerName="oc" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.127034 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="29db339c-88ad-410b-bad1-e5f5328e9a0a" containerName="oc" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.128524 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.142737 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j5kdg"] Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.167036 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpsmb\" (UniqueName: \"kubernetes.io/projected/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-kube-api-access-lpsmb\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.167345 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-utilities\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.167569 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-catalog-content\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.270147 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-catalog-content\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.270262 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lpsmb\" (UniqueName: \"kubernetes.io/projected/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-kube-api-access-lpsmb\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.270294 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-utilities\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.270800 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-utilities\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.271018 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-catalog-content\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.290410 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpsmb\" (UniqueName: \"kubernetes.io/projected/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-kube-api-access-lpsmb\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.449382 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:21 crc kubenswrapper[4781]: I0227 00:42:21.008237 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j5kdg"] Feb 27 00:42:21 crc kubenswrapper[4781]: I0227 00:42:21.277965 4781 generic.go:334] "Generic (PLEG): container finished" podID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerID="2fbf61290fa4aee7b5f7d7ed2e1a6d6a175da2775967ab4c314c14bc7cf150d5" exitCode=0 Feb 27 00:42:21 crc kubenswrapper[4781]: I0227 00:42:21.278083 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5kdg" event={"ID":"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549","Type":"ContainerDied","Data":"2fbf61290fa4aee7b5f7d7ed2e1a6d6a175da2775967ab4c314c14bc7cf150d5"} Feb 27 00:42:21 crc kubenswrapper[4781]: I0227 00:42:21.278299 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5kdg" event={"ID":"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549","Type":"ContainerStarted","Data":"c217e1db82494a5ce8a1988d7b8a9301ae905a7ad5c32fec2877aa3b52e831a0"} Feb 27 00:42:21 crc kubenswrapper[4781]: I0227 00:42:21.280641 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 00:42:22 crc kubenswrapper[4781]: I0227 00:42:22.291093 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5kdg" event={"ID":"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549","Type":"ContainerStarted","Data":"40593e6c39e318c4bc42b772884d5b14e6880366dad393e45d23642abc403493"} Feb 27 00:42:24 crc kubenswrapper[4781]: I0227 00:42:24.321716 4781 generic.go:334] "Generic (PLEG): container finished" podID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerID="40593e6c39e318c4bc42b772884d5b14e6880366dad393e45d23642abc403493" exitCode=0 Feb 27 00:42:24 crc kubenswrapper[4781]: I0227 00:42:24.322346 4781 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-j5kdg" event={"ID":"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549","Type":"ContainerDied","Data":"40593e6c39e318c4bc42b772884d5b14e6880366dad393e45d23642abc403493"} Feb 27 00:42:26 crc kubenswrapper[4781]: I0227 00:42:26.343045 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5kdg" event={"ID":"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549","Type":"ContainerStarted","Data":"4572cbff2debbe399dc6df59ffc2c15b2ea56016116f3d018932b0bcc0eae69e"} Feb 27 00:42:26 crc kubenswrapper[4781]: I0227 00:42:26.361762 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j5kdg" podStartSLOduration=2.678562532 podStartE2EDuration="6.361741623s" podCreationTimestamp="2026-02-27 00:42:20 +0000 UTC" firstStartedPulling="2026-02-27 00:42:21.280386289 +0000 UTC m=+2210.537925843" lastFinishedPulling="2026-02-27 00:42:24.96356538 +0000 UTC m=+2214.221104934" observedRunningTime="2026-02-27 00:42:26.359113734 +0000 UTC m=+2215.616653298" watchObservedRunningTime="2026-02-27 00:42:26.361741623 +0000 UTC m=+2215.619281177" Feb 27 00:42:30 crc kubenswrapper[4781]: I0227 00:42:30.450268 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:30 crc kubenswrapper[4781]: I0227 00:42:30.452428 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:30 crc kubenswrapper[4781]: I0227 00:42:30.501509 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:30 crc kubenswrapper[4781]: I0227 00:42:30.905382 4781 scope.go:117] "RemoveContainer" containerID="53c40723095bbd1b6e5cbec68ec5b0fac1a46ad7d3ad91a7ae622222a7ca48d5" Feb 27 00:42:31 crc kubenswrapper[4781]: I0227 00:42:31.007328 4781 
scope.go:117] "RemoveContainer" containerID="4c15c466d7915dc653aadd3dff0e84b4a8fd3f49a7805b84c66c98b2891abd65" Feb 27 00:42:31 crc kubenswrapper[4781]: I0227 00:42:31.435623 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:31 crc kubenswrapper[4781]: I0227 00:42:31.813114 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j5kdg"] Feb 27 00:42:33 crc kubenswrapper[4781]: I0227 00:42:33.405225 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j5kdg" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerName="registry-server" containerID="cri-o://4572cbff2debbe399dc6df59ffc2c15b2ea56016116f3d018932b0bcc0eae69e" gracePeriod=2 Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.422010 4781 generic.go:334] "Generic (PLEG): container finished" podID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerID="4572cbff2debbe399dc6df59ffc2c15b2ea56016116f3d018932b0bcc0eae69e" exitCode=0 Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.422145 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5kdg" event={"ID":"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549","Type":"ContainerDied","Data":"4572cbff2debbe399dc6df59ffc2c15b2ea56016116f3d018932b0bcc0eae69e"} Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.525940 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.690254 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-utilities\") pod \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.690359 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-catalog-content\") pod \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.690415 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpsmb\" (UniqueName: \"kubernetes.io/projected/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-kube-api-access-lpsmb\") pod \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.692408 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-utilities" (OuterVolumeSpecName: "utilities") pod "3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" (UID: "3bd52a8a-5bfd-45f3-8d26-90dcc81f2549"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.707785 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-kube-api-access-lpsmb" (OuterVolumeSpecName: "kube-api-access-lpsmb") pod "3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" (UID: "3bd52a8a-5bfd-45f3-8d26-90dcc81f2549"). InnerVolumeSpecName "kube-api-access-lpsmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.761011 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" (UID: "3bd52a8a-5bfd-45f3-8d26-90dcc81f2549"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.792397 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.792434 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.792447 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpsmb\" (UniqueName: \"kubernetes.io/projected/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-kube-api-access-lpsmb\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:35 crc kubenswrapper[4781]: I0227 00:42:35.435583 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5kdg" event={"ID":"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549","Type":"ContainerDied","Data":"c217e1db82494a5ce8a1988d7b8a9301ae905a7ad5c32fec2877aa3b52e831a0"} Feb 27 00:42:35 crc kubenswrapper[4781]: I0227 00:42:35.435674 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:35 crc kubenswrapper[4781]: I0227 00:42:35.436331 4781 scope.go:117] "RemoveContainer" containerID="4572cbff2debbe399dc6df59ffc2c15b2ea56016116f3d018932b0bcc0eae69e" Feb 27 00:42:35 crc kubenswrapper[4781]: I0227 00:42:35.462442 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j5kdg"] Feb 27 00:42:35 crc kubenswrapper[4781]: I0227 00:42:35.469684 4781 scope.go:117] "RemoveContainer" containerID="40593e6c39e318c4bc42b772884d5b14e6880366dad393e45d23642abc403493" Feb 27 00:42:35 crc kubenswrapper[4781]: I0227 00:42:35.471997 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j5kdg"] Feb 27 00:42:35 crc kubenswrapper[4781]: I0227 00:42:35.488408 4781 scope.go:117] "RemoveContainer" containerID="2fbf61290fa4aee7b5f7d7ed2e1a6d6a175da2775967ab4c314c14bc7cf150d5" Feb 27 00:42:37 crc kubenswrapper[4781]: I0227 00:42:37.320005 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" path="/var/lib/kubelet/pods/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549/volumes" Feb 27 00:42:42 crc kubenswrapper[4781]: I0227 00:42:42.896074 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:42:42 crc kubenswrapper[4781]: I0227 00:42:42.896580 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:42:49 crc kubenswrapper[4781]: 
I0227 00:42:49.592724 4781 generic.go:334] "Generic (PLEG): container finished" podID="3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" containerID="e8eaf2ec603b6edc8badc09564a9675ccb658970cd78310fb0d45ee49918516f" exitCode=0 Feb 27 00:42:49 crc kubenswrapper[4781]: I0227 00:42:49.592821 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" event={"ID":"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87","Type":"ContainerDied","Data":"e8eaf2ec603b6edc8badc09564a9675ccb658970cd78310fb0d45ee49918516f"} Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.153235 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.282590 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-metadata-combined-ca-bundle\") pod \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.282799 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-ssh-key-openstack-edpm-ipam\") pod \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.282866 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-nova-metadata-neutron-config-0\") pod \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.282895 4781 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-inventory\") pod \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.282922 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntw6w\" (UniqueName: \"kubernetes.io/projected/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-kube-api-access-ntw6w\") pod \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.282940 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-ovn-metadata-agent-neutron-config-0\") pod \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.289065 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-kube-api-access-ntw6w" (OuterVolumeSpecName: "kube-api-access-ntw6w") pod "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" (UID: "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87"). InnerVolumeSpecName "kube-api-access-ntw6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.290533 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" (UID: "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.319028 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" (UID: "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.319196 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" (UID: "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.327447 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" (UID: "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.327857 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-inventory" (OuterVolumeSpecName: "inventory") pod "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" (UID: "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.386903 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.386938 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntw6w\" (UniqueName: \"kubernetes.io/projected/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-kube-api-access-ntw6w\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.386971 4781 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.386984 4781 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.386994 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.387004 4781 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.622413 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" event={"ID":"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87","Type":"ContainerDied","Data":"e006780cf4daccda68c6b25b213909e5856f96dd7fd97d562673bc7d27726bb2"} Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.622470 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e006780cf4daccda68c6b25b213909e5856f96dd7fd97d562673bc7d27726bb2" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.623726 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.711350 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c"] Feb 27 00:42:51 crc kubenswrapper[4781]: E0227 00:42:51.712003 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerName="extract-utilities" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.712018 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerName="extract-utilities" Feb 27 00:42:51 crc kubenswrapper[4781]: E0227 00:42:51.712038 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.712045 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 27 00:42:51 crc kubenswrapper[4781]: E0227 00:42:51.712053 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerName="registry-server" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.712059 4781 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerName="registry-server" Feb 27 00:42:51 crc kubenswrapper[4781]: E0227 00:42:51.712075 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerName="extract-content" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.712081 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerName="extract-content" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.712239 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerName="registry-server" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.712265 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.712973 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.715201 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.715277 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.716998 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.717155 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.717181 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.719973 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c"] Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.795248 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.795361 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: 
\"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.795428 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.795497 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np7xl\" (UniqueName: \"kubernetes.io/projected/bd292468-b151-4004-b0b7-bd873e7e4e2d-kube-api-access-np7xl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.795545 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.896937 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.897026 
4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.897100 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np7xl\" (UniqueName: \"kubernetes.io/projected/bd292468-b151-4004-b0b7-bd873e7e4e2d-kube-api-access-np7xl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.897155 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.897185 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.901342 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.901517 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.901828 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.902033 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.918779 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np7xl\" (UniqueName: \"kubernetes.io/projected/bd292468-b151-4004-b0b7-bd873e7e4e2d-kube-api-access-np7xl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:52 crc kubenswrapper[4781]: I0227 00:42:52.029974 4781 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:52 crc kubenswrapper[4781]: I0227 00:42:52.603429 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c"] Feb 27 00:42:52 crc kubenswrapper[4781]: I0227 00:42:52.632332 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" event={"ID":"bd292468-b151-4004-b0b7-bd873e7e4e2d","Type":"ContainerStarted","Data":"c119cf35418cf9a52f75fa4eac36439312f59759c419c8f80f423d37df05fd2f"} Feb 27 00:42:53 crc kubenswrapper[4781]: I0227 00:42:53.645782 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" event={"ID":"bd292468-b151-4004-b0b7-bd873e7e4e2d","Type":"ContainerStarted","Data":"82f87db0afeb37c294b7dd4a8934c5d99082b1d59480c43a23f358b6efcac0cb"} Feb 27 00:42:53 crc kubenswrapper[4781]: I0227 00:42:53.665746 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" podStartSLOduration=2.241658892 podStartE2EDuration="2.665725931s" podCreationTimestamp="2026-02-27 00:42:51 +0000 UTC" firstStartedPulling="2026-02-27 00:42:52.607098864 +0000 UTC m=+2241.864638418" lastFinishedPulling="2026-02-27 00:42:53.031165873 +0000 UTC m=+2242.288705457" observedRunningTime="2026-02-27 00:42:53.664841998 +0000 UTC m=+2242.922381562" watchObservedRunningTime="2026-02-27 00:42:53.665725931 +0000 UTC m=+2242.923265485" Feb 27 00:43:12 crc kubenswrapper[4781]: I0227 00:43:12.895380 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:43:12 crc kubenswrapper[4781]: 
I0227 00:43:12.896270 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:43:12 crc kubenswrapper[4781]: I0227 00:43:12.896319 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:43:12 crc kubenswrapper[4781]: I0227 00:43:12.897052 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4da315c4c7bf218d380bca00c0ade3ee72457fd61b27366edc67ffcf85618e37"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:43:12 crc kubenswrapper[4781]: I0227 00:43:12.897114 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://4da315c4c7bf218d380bca00c0ade3ee72457fd61b27366edc67ffcf85618e37" gracePeriod=600 Feb 27 00:43:13 crc kubenswrapper[4781]: I0227 00:43:13.843756 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="4da315c4c7bf218d380bca00c0ade3ee72457fd61b27366edc67ffcf85618e37" exitCode=0 Feb 27 00:43:13 crc kubenswrapper[4781]: I0227 00:43:13.843828 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"4da315c4c7bf218d380bca00c0ade3ee72457fd61b27366edc67ffcf85618e37"} Feb 27 00:43:13 crc 
kubenswrapper[4781]: I0227 00:43:13.844669 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"} Feb 27 00:43:13 crc kubenswrapper[4781]: I0227 00:43:13.844702 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.159824 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535884-8t2lb"] Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.161803 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.164977 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.165015 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.165136 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.177739 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535884-8t2lb"] Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.233024 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xndw\" (UniqueName: \"kubernetes.io/projected/018f4ff5-f081-4257-8189-3eb14ea035f3-kube-api-access-8xndw\") pod \"auto-csr-approver-29535884-8t2lb\" (UID: \"018f4ff5-f081-4257-8189-3eb14ea035f3\") " 
pod="openshift-infra/auto-csr-approver-29535884-8t2lb" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.335263 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xndw\" (UniqueName: \"kubernetes.io/projected/018f4ff5-f081-4257-8189-3eb14ea035f3-kube-api-access-8xndw\") pod \"auto-csr-approver-29535884-8t2lb\" (UID: \"018f4ff5-f081-4257-8189-3eb14ea035f3\") " pod="openshift-infra/auto-csr-approver-29535884-8t2lb" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.353107 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xndw\" (UniqueName: \"kubernetes.io/projected/018f4ff5-f081-4257-8189-3eb14ea035f3-kube-api-access-8xndw\") pod \"auto-csr-approver-29535884-8t2lb\" (UID: \"018f4ff5-f081-4257-8189-3eb14ea035f3\") " pod="openshift-infra/auto-csr-approver-29535884-8t2lb" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.486110 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.968398 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535884-8t2lb"] Feb 27 00:44:01 crc kubenswrapper[4781]: I0227 00:44:01.307764 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" event={"ID":"018f4ff5-f081-4257-8189-3eb14ea035f3","Type":"ContainerStarted","Data":"f7470c5bf777d76181cd4c7a6803e2c6ed79b2d14788346a445f0ea22ee049cd"} Feb 27 00:44:02 crc kubenswrapper[4781]: I0227 00:44:02.318704 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" event={"ID":"018f4ff5-f081-4257-8189-3eb14ea035f3","Type":"ContainerStarted","Data":"26e013582f5ee2e314ebc2f4329b87db88bd3251fee9e3e932b5b02ee387f73b"} Feb 27 00:44:02 crc kubenswrapper[4781]: I0227 00:44:02.342770 4781 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" podStartSLOduration=1.394038211 podStartE2EDuration="2.342751616s" podCreationTimestamp="2026-02-27 00:44:00 +0000 UTC" firstStartedPulling="2026-02-27 00:44:00.969554041 +0000 UTC m=+2310.227093595" lastFinishedPulling="2026-02-27 00:44:01.918267446 +0000 UTC m=+2311.175807000" observedRunningTime="2026-02-27 00:44:02.335250429 +0000 UTC m=+2311.592789983" watchObservedRunningTime="2026-02-27 00:44:02.342751616 +0000 UTC m=+2311.600291170" Feb 27 00:44:03 crc kubenswrapper[4781]: I0227 00:44:03.356345 4781 generic.go:334] "Generic (PLEG): container finished" podID="018f4ff5-f081-4257-8189-3eb14ea035f3" containerID="26e013582f5ee2e314ebc2f4329b87db88bd3251fee9e3e932b5b02ee387f73b" exitCode=0 Feb 27 00:44:03 crc kubenswrapper[4781]: I0227 00:44:03.357735 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" event={"ID":"018f4ff5-f081-4257-8189-3eb14ea035f3","Type":"ContainerDied","Data":"26e013582f5ee2e314ebc2f4329b87db88bd3251fee9e3e932b5b02ee387f73b"} Feb 27 00:44:04 crc kubenswrapper[4781]: I0227 00:44:04.833692 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" Feb 27 00:44:04 crc kubenswrapper[4781]: I0227 00:44:04.933216 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xndw\" (UniqueName: \"kubernetes.io/projected/018f4ff5-f081-4257-8189-3eb14ea035f3-kube-api-access-8xndw\") pod \"018f4ff5-f081-4257-8189-3eb14ea035f3\" (UID: \"018f4ff5-f081-4257-8189-3eb14ea035f3\") " Feb 27 00:44:04 crc kubenswrapper[4781]: I0227 00:44:04.939757 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/018f4ff5-f081-4257-8189-3eb14ea035f3-kube-api-access-8xndw" (OuterVolumeSpecName: "kube-api-access-8xndw") pod "018f4ff5-f081-4257-8189-3eb14ea035f3" (UID: "018f4ff5-f081-4257-8189-3eb14ea035f3"). InnerVolumeSpecName "kube-api-access-8xndw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:44:05 crc kubenswrapper[4781]: I0227 00:44:05.035480 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xndw\" (UniqueName: \"kubernetes.io/projected/018f4ff5-f081-4257-8189-3eb14ea035f3-kube-api-access-8xndw\") on node \"crc\" DevicePath \"\"" Feb 27 00:44:05 crc kubenswrapper[4781]: I0227 00:44:05.376439 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" event={"ID":"018f4ff5-f081-4257-8189-3eb14ea035f3","Type":"ContainerDied","Data":"f7470c5bf777d76181cd4c7a6803e2c6ed79b2d14788346a445f0ea22ee049cd"} Feb 27 00:44:05 crc kubenswrapper[4781]: I0227 00:44:05.376477 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7470c5bf777d76181cd4c7a6803e2c6ed79b2d14788346a445f0ea22ee049cd" Feb 27 00:44:05 crc kubenswrapper[4781]: I0227 00:44:05.376519 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" Feb 27 00:44:05 crc kubenswrapper[4781]: I0227 00:44:05.422367 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535878-49z87"] Feb 27 00:44:05 crc kubenswrapper[4781]: I0227 00:44:05.431648 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535878-49z87"] Feb 27 00:44:07 crc kubenswrapper[4781]: I0227 00:44:07.324954 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b63206fe-04b3-4f07-a4cb-f8fd89645931" path="/var/lib/kubelet/pods/b63206fe-04b3-4f07-a4cb-f8fd89645931/volumes" Feb 27 00:44:31 crc kubenswrapper[4781]: I0227 00:44:31.177980 4781 scope.go:117] "RemoveContainer" containerID="e098a22e98e83ab04db629aad7e6384885fe2b771dad33544e78c6562872ae4e" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.149858 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2"] Feb 27 00:45:00 crc kubenswrapper[4781]: E0227 00:45:00.150785 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="018f4ff5-f081-4257-8189-3eb14ea035f3" containerName="oc" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.150798 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="018f4ff5-f081-4257-8189-3eb14ea035f3" containerName="oc" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.151000 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="018f4ff5-f081-4257-8189-3eb14ea035f3" containerName="oc" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.151842 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.154290 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.154381 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.163250 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2"] Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.179867 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-secret-volume\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.179983 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4l5d\" (UniqueName: \"kubernetes.io/projected/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-kube-api-access-l4l5d\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.180085 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-config-volume\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.281512 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4l5d\" (UniqueName: \"kubernetes.io/projected/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-kube-api-access-l4l5d\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.281672 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-config-volume\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.281795 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-secret-volume\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.282581 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-config-volume\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.287014 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-secret-volume\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.298580 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4l5d\" (UniqueName: \"kubernetes.io/projected/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-kube-api-access-l4l5d\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.475778 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.925986 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2"] Feb 27 00:45:01 crc kubenswrapper[4781]: I0227 00:45:01.884545 4781 generic.go:334] "Generic (PLEG): container finished" podID="a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db" containerID="13bcf8d94b2a16937b07dfe8f4ce503b88a240b7d9c23876edfc03e06b4dceeb" exitCode=0 Feb 27 00:45:01 crc kubenswrapper[4781]: I0227 00:45:01.884762 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" event={"ID":"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db","Type":"ContainerDied","Data":"13bcf8d94b2a16937b07dfe8f4ce503b88a240b7d9c23876edfc03e06b4dceeb"} Feb 27 00:45:01 crc kubenswrapper[4781]: I0227 00:45:01.885262 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" 
event={"ID":"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db","Type":"ContainerStarted","Data":"d25cc4cd235d26dd009c8b84749f6ad9cdbca5cf44724fba98ad630d5bb5c967"} Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.319842 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.443444 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-secret-volume\") pod \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.443656 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4l5d\" (UniqueName: \"kubernetes.io/projected/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-kube-api-access-l4l5d\") pod \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.443717 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-config-volume\") pod \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.444458 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-config-volume" (OuterVolumeSpecName: "config-volume") pod "a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db" (UID: "a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.449388 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-kube-api-access-l4l5d" (OuterVolumeSpecName: "kube-api-access-l4l5d") pod "a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db" (UID: "a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db"). InnerVolumeSpecName "kube-api-access-l4l5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.449800 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db" (UID: "a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.546289 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.546327 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4l5d\" (UniqueName: \"kubernetes.io/projected/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-kube-api-access-l4l5d\") on node \"crc\" DevicePath \"\"" Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.546338 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.916702 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" 
event={"ID":"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db","Type":"ContainerDied","Data":"d25cc4cd235d26dd009c8b84749f6ad9cdbca5cf44724fba98ad630d5bb5c967"} Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.916741 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d25cc4cd235d26dd009c8b84749f6ad9cdbca5cf44724fba98ad630d5bb5c967" Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.916790 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:04 crc kubenswrapper[4781]: I0227 00:45:04.391045 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm"] Feb 27 00:45:04 crc kubenswrapper[4781]: I0227 00:45:04.400224 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm"] Feb 27 00:45:05 crc kubenswrapper[4781]: I0227 00:45:05.324134 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="678f27fc-d210-4a4f-bd73-090378740da9" path="/var/lib/kubelet/pods/678f27fc-d210-4a4f-bd73-090378740da9/volumes" Feb 27 00:45:31 crc kubenswrapper[4781]: I0227 00:45:31.254825 4781 scope.go:117] "RemoveContainer" containerID="898ccef1da25e7c00fcd11040419fe4b505ada16cb26d62d9a4806872cb68348" Feb 27 00:45:42 crc kubenswrapper[4781]: I0227 00:45:42.894994 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:45:42 crc kubenswrapper[4781]: I0227 00:45:42.895802 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.172007 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535886-5khtq"] Feb 27 00:46:00 crc kubenswrapper[4781]: E0227 00:46:00.175392 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db" containerName="collect-profiles" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.175445 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db" containerName="collect-profiles" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.176052 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db" containerName="collect-profiles" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.177237 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535886-5khtq" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.182941 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.183733 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.184437 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.204223 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535886-5khtq"] Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.312701 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m52c4\" (UniqueName: \"kubernetes.io/projected/8143ddb0-990c-4f1e-9130-7ca30776e64b-kube-api-access-m52c4\") pod \"auto-csr-approver-29535886-5khtq\" (UID: \"8143ddb0-990c-4f1e-9130-7ca30776e64b\") " pod="openshift-infra/auto-csr-approver-29535886-5khtq" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.414360 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m52c4\" (UniqueName: \"kubernetes.io/projected/8143ddb0-990c-4f1e-9130-7ca30776e64b-kube-api-access-m52c4\") pod \"auto-csr-approver-29535886-5khtq\" (UID: \"8143ddb0-990c-4f1e-9130-7ca30776e64b\") " pod="openshift-infra/auto-csr-approver-29535886-5khtq" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.433012 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m52c4\" (UniqueName: \"kubernetes.io/projected/8143ddb0-990c-4f1e-9130-7ca30776e64b-kube-api-access-m52c4\") pod \"auto-csr-approver-29535886-5khtq\" (UID: \"8143ddb0-990c-4f1e-9130-7ca30776e64b\") " 
pod="openshift-infra/auto-csr-approver-29535886-5khtq" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.522213 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535886-5khtq" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.978061 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535886-5khtq"] Feb 27 00:46:01 crc kubenswrapper[4781]: I0227 00:46:01.481557 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535886-5khtq" event={"ID":"8143ddb0-990c-4f1e-9130-7ca30776e64b","Type":"ContainerStarted","Data":"088040718b7f1e23e576481b5b67af8b3f210dfca038e331cb9c81b5567e1956"} Feb 27 00:46:02 crc kubenswrapper[4781]: I0227 00:46:02.492452 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535886-5khtq" event={"ID":"8143ddb0-990c-4f1e-9130-7ca30776e64b","Type":"ContainerStarted","Data":"e0bb531ca8e9ee4c1a35ccb62422bfe50af2c334314f4bd145d5137b8ad741e6"} Feb 27 00:46:02 crc kubenswrapper[4781]: I0227 00:46:02.517348 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535886-5khtq" podStartSLOduration=1.455489765 podStartE2EDuration="2.517331706s" podCreationTimestamp="2026-02-27 00:46:00 +0000 UTC" firstStartedPulling="2026-02-27 00:46:00.989729086 +0000 UTC m=+2430.247268640" lastFinishedPulling="2026-02-27 00:46:02.051571027 +0000 UTC m=+2431.309110581" observedRunningTime="2026-02-27 00:46:02.511103672 +0000 UTC m=+2431.768643226" watchObservedRunningTime="2026-02-27 00:46:02.517331706 +0000 UTC m=+2431.774871260" Feb 27 00:46:03 crc kubenswrapper[4781]: I0227 00:46:03.501695 4781 generic.go:334] "Generic (PLEG): container finished" podID="8143ddb0-990c-4f1e-9130-7ca30776e64b" containerID="e0bb531ca8e9ee4c1a35ccb62422bfe50af2c334314f4bd145d5137b8ad741e6" exitCode=0 Feb 27 00:46:03 crc 
kubenswrapper[4781]: I0227 00:46:03.501768 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535886-5khtq" event={"ID":"8143ddb0-990c-4f1e-9130-7ca30776e64b","Type":"ContainerDied","Data":"e0bb531ca8e9ee4c1a35ccb62422bfe50af2c334314f4bd145d5137b8ad741e6"} Feb 27 00:46:04 crc kubenswrapper[4781]: I0227 00:46:04.904082 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535886-5khtq" Feb 27 00:46:05 crc kubenswrapper[4781]: I0227 00:46:05.012268 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m52c4\" (UniqueName: \"kubernetes.io/projected/8143ddb0-990c-4f1e-9130-7ca30776e64b-kube-api-access-m52c4\") pod \"8143ddb0-990c-4f1e-9130-7ca30776e64b\" (UID: \"8143ddb0-990c-4f1e-9130-7ca30776e64b\") " Feb 27 00:46:05 crc kubenswrapper[4781]: I0227 00:46:05.018990 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8143ddb0-990c-4f1e-9130-7ca30776e64b-kube-api-access-m52c4" (OuterVolumeSpecName: "kube-api-access-m52c4") pod "8143ddb0-990c-4f1e-9130-7ca30776e64b" (UID: "8143ddb0-990c-4f1e-9130-7ca30776e64b"). InnerVolumeSpecName "kube-api-access-m52c4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:46:05 crc kubenswrapper[4781]: I0227 00:46:05.115587 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m52c4\" (UniqueName: \"kubernetes.io/projected/8143ddb0-990c-4f1e-9130-7ca30776e64b-kube-api-access-m52c4\") on node \"crc\" DevicePath \"\"" Feb 27 00:46:05 crc kubenswrapper[4781]: I0227 00:46:05.528007 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535886-5khtq" event={"ID":"8143ddb0-990c-4f1e-9130-7ca30776e64b","Type":"ContainerDied","Data":"088040718b7f1e23e576481b5b67af8b3f210dfca038e331cb9c81b5567e1956"} Feb 27 00:46:05 crc kubenswrapper[4781]: I0227 00:46:05.528069 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="088040718b7f1e23e576481b5b67af8b3f210dfca038e331cb9c81b5567e1956" Feb 27 00:46:05 crc kubenswrapper[4781]: I0227 00:46:05.528079 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535886-5khtq" Feb 27 00:46:05 crc kubenswrapper[4781]: I0227 00:46:05.588372 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535880-9cpwk"] Feb 27 00:46:05 crc kubenswrapper[4781]: I0227 00:46:05.597487 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535880-9cpwk"] Feb 27 00:46:07 crc kubenswrapper[4781]: I0227 00:46:07.321913 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93fc175b-7238-41ec-91f7-17cc07188100" path="/var/lib/kubelet/pods/93fc175b-7238-41ec-91f7-17cc07188100/volumes" Feb 27 00:46:12 crc kubenswrapper[4781]: I0227 00:46:12.895323 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 27 00:46:12 crc kubenswrapper[4781]: I0227 00:46:12.896882 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:46:30 crc kubenswrapper[4781]: I0227 00:46:30.765173 4781 generic.go:334] "Generic (PLEG): container finished" podID="bd292468-b151-4004-b0b7-bd873e7e4e2d" containerID="82f87db0afeb37c294b7dd4a8934c5d99082b1d59480c43a23f358b6efcac0cb" exitCode=0 Feb 27 00:46:30 crc kubenswrapper[4781]: I0227 00:46:30.765266 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" event={"ID":"bd292468-b151-4004-b0b7-bd873e7e4e2d","Type":"ContainerDied","Data":"82f87db0afeb37c294b7dd4a8934c5d99082b1d59480c43a23f358b6efcac0cb"} Feb 27 00:46:31 crc kubenswrapper[4781]: I0227 00:46:31.354956 4781 scope.go:117] "RemoveContainer" containerID="f95b25c7f6b69f37212289ff6ccaf1c8b693e043eb0635c23ef340ef5632fb12" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.322375 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.395787 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-secret-0\") pod \"bd292468-b151-4004-b0b7-bd873e7e4e2d\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.396328 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-inventory\") pod \"bd292468-b151-4004-b0b7-bd873e7e4e2d\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.396430 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np7xl\" (UniqueName: \"kubernetes.io/projected/bd292468-b151-4004-b0b7-bd873e7e4e2d-kube-api-access-np7xl\") pod \"bd292468-b151-4004-b0b7-bd873e7e4e2d\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.396464 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-ssh-key-openstack-edpm-ipam\") pod \"bd292468-b151-4004-b0b7-bd873e7e4e2d\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.396501 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-combined-ca-bundle\") pod \"bd292468-b151-4004-b0b7-bd873e7e4e2d\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.403480 4781 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "bd292468-b151-4004-b0b7-bd873e7e4e2d" (UID: "bd292468-b151-4004-b0b7-bd873e7e4e2d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.419047 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd292468-b151-4004-b0b7-bd873e7e4e2d-kube-api-access-np7xl" (OuterVolumeSpecName: "kube-api-access-np7xl") pod "bd292468-b151-4004-b0b7-bd873e7e4e2d" (UID: "bd292468-b151-4004-b0b7-bd873e7e4e2d"). InnerVolumeSpecName "kube-api-access-np7xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.427140 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bd292468-b151-4004-b0b7-bd873e7e4e2d" (UID: "bd292468-b151-4004-b0b7-bd873e7e4e2d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.432123 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "bd292468-b151-4004-b0b7-bd873e7e4e2d" (UID: "bd292468-b151-4004-b0b7-bd873e7e4e2d"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.437778 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-inventory" (OuterVolumeSpecName: "inventory") pod "bd292468-b151-4004-b0b7-bd873e7e4e2d" (UID: "bd292468-b151-4004-b0b7-bd873e7e4e2d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.499361 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.499400 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np7xl\" (UniqueName: \"kubernetes.io/projected/bd292468-b151-4004-b0b7-bd873e7e4e2d-kube-api-access-np7xl\") on node \"crc\" DevicePath \"\"" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.499413 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.499423 4781 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.499434 4781 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.787598 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" event={"ID":"bd292468-b151-4004-b0b7-bd873e7e4e2d","Type":"ContainerDied","Data":"c119cf35418cf9a52f75fa4eac36439312f59759c419c8f80f423d37df05fd2f"} Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.787673 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c119cf35418cf9a52f75fa4eac36439312f59759c419c8f80f423d37df05fd2f" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.787669 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.880380 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h"] Feb 27 00:46:32 crc kubenswrapper[4781]: E0227 00:46:32.880818 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8143ddb0-990c-4f1e-9130-7ca30776e64b" containerName="oc" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.880835 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8143ddb0-990c-4f1e-9130-7ca30776e64b" containerName="oc" Feb 27 00:46:32 crc kubenswrapper[4781]: E0227 00:46:32.880849 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd292468-b151-4004-b0b7-bd873e7e4e2d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.880856 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd292468-b151-4004-b0b7-bd873e7e4e2d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.881070 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8143ddb0-990c-4f1e-9130-7ca30776e64b" containerName="oc" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.881090 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bd292468-b151-4004-b0b7-bd873e7e4e2d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.881795 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.885443 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.885722 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.886914 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.887257 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.887456 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.887732 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.889363 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.892710 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h"] Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.909895 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.909971 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.910023 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.910266 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.910362 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: 
\"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.910460 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7qk5\" (UniqueName: \"kubernetes.io/projected/d3f8abc3-17b4-4d88-890e-85304a100a97-kube-api-access-n7qk5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.910494 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.910524 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.910547 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 
00:46:32.910728 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.910795 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013456 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013547 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013599 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7qk5\" (UniqueName: \"kubernetes.io/projected/d3f8abc3-17b4-4d88-890e-85304a100a97-kube-api-access-n7qk5\") 
pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013622 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013661 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013677 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013704 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013726 
4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013762 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013787 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013801 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.016978 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.018123 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.018532 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.018547 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.018709 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc 
kubenswrapper[4781]: I0227 00:46:33.018841 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.019850 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.021295 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.021471 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.022713 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.042390 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7qk5\" (UniqueName: \"kubernetes.io/projected/d3f8abc3-17b4-4d88-890e-85304a100a97-kube-api-access-n7qk5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.214184 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:34 crc kubenswrapper[4781]: I0227 00:46:34.597106 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h"] Feb 27 00:46:34 crc kubenswrapper[4781]: I0227 00:46:34.807181 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" event={"ID":"d3f8abc3-17b4-4d88-890e-85304a100a97","Type":"ContainerStarted","Data":"0f40d07d67261eda4bba8df0dd754f507383635699da4a2039a40542ef874ffe"} Feb 27 00:46:35 crc kubenswrapper[4781]: I0227 00:46:35.818972 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" event={"ID":"d3f8abc3-17b4-4d88-890e-85304a100a97","Type":"ContainerStarted","Data":"d3e6c31e59c8273a4822b6ba92413f35b26f4d3e1b11014494798bec77bd763c"} Feb 27 00:46:35 crc kubenswrapper[4781]: I0227 00:46:35.843257 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" podStartSLOduration=3.425861555 podStartE2EDuration="3.8432386s" podCreationTimestamp="2026-02-27 00:46:32 +0000 UTC" firstStartedPulling="2026-02-27 
00:46:34.59959932 +0000 UTC m=+2463.857138874" lastFinishedPulling="2026-02-27 00:46:35.016976365 +0000 UTC m=+2464.274515919" observedRunningTime="2026-02-27 00:46:35.835396904 +0000 UTC m=+2465.092936478" watchObservedRunningTime="2026-02-27 00:46:35.8432386 +0000 UTC m=+2465.100778154" Feb 27 00:46:42 crc kubenswrapper[4781]: I0227 00:46:42.895933 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:46:42 crc kubenswrapper[4781]: I0227 00:46:42.896564 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:46:42 crc kubenswrapper[4781]: I0227 00:46:42.896605 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:46:42 crc kubenswrapper[4781]: I0227 00:46:42.897353 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:46:42 crc kubenswrapper[4781]: I0227 00:46:42.897396 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" 
containerID="cri-o://b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" gracePeriod=600 Feb 27 00:46:43 crc kubenswrapper[4781]: E0227 00:46:43.027793 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:46:43 crc kubenswrapper[4781]: I0227 00:46:43.893613 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" exitCode=0 Feb 27 00:46:43 crc kubenswrapper[4781]: I0227 00:46:43.893764 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"} Feb 27 00:46:43 crc kubenswrapper[4781]: I0227 00:46:43.894070 4781 scope.go:117] "RemoveContainer" containerID="4da315c4c7bf218d380bca00c0ade3ee72457fd61b27366edc67ffcf85618e37" Feb 27 00:46:43 crc kubenswrapper[4781]: I0227 00:46:43.895770 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:46:43 crc kubenswrapper[4781]: E0227 00:46:43.896269 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" 
podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:46:55 crc kubenswrapper[4781]: I0227 00:46:55.310785 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:46:55 crc kubenswrapper[4781]: E0227 00:46:55.312096 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:47:10 crc kubenswrapper[4781]: I0227 00:47:10.309352 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:47:10 crc kubenswrapper[4781]: E0227 00:47:10.310277 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:47:22 crc kubenswrapper[4781]: I0227 00:47:22.309821 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:47:22 crc kubenswrapper[4781]: E0227 00:47:22.310512 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:47:34 crc kubenswrapper[4781]: I0227 00:47:34.309291 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:47:34 crc kubenswrapper[4781]: E0227 00:47:34.311021 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:47:46 crc kubenswrapper[4781]: I0227 00:47:46.310030 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:47:46 crc kubenswrapper[4781]: E0227 00:47:46.312568 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.149946 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535888-nb28f"] Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.151817 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535888-nb28f" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.155308 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.155376 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.155456 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.172924 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535888-nb28f"] Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.194848 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lh24\" (UniqueName: \"kubernetes.io/projected/231f1edd-305c-4a6c-bd4e-11c12c2ae515-kube-api-access-9lh24\") pod \"auto-csr-approver-29535888-nb28f\" (UID: \"231f1edd-305c-4a6c-bd4e-11c12c2ae515\") " pod="openshift-infra/auto-csr-approver-29535888-nb28f" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.299272 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lh24\" (UniqueName: \"kubernetes.io/projected/231f1edd-305c-4a6c-bd4e-11c12c2ae515-kube-api-access-9lh24\") pod \"auto-csr-approver-29535888-nb28f\" (UID: \"231f1edd-305c-4a6c-bd4e-11c12c2ae515\") " pod="openshift-infra/auto-csr-approver-29535888-nb28f" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.313812 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:48:00 crc kubenswrapper[4781]: E0227 00:48:00.314061 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.327545 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lh24\" (UniqueName: \"kubernetes.io/projected/231f1edd-305c-4a6c-bd4e-11c12c2ae515-kube-api-access-9lh24\") pod \"auto-csr-approver-29535888-nb28f\" (UID: \"231f1edd-305c-4a6c-bd4e-11c12c2ae515\") " pod="openshift-infra/auto-csr-approver-29535888-nb28f" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.473990 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535888-nb28f" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.958679 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535888-nb28f"] Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.962238 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 00:48:01 crc kubenswrapper[4781]: I0227 00:48:01.621209 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535888-nb28f" event={"ID":"231f1edd-305c-4a6c-bd4e-11c12c2ae515","Type":"ContainerStarted","Data":"a0c41d90becd6a07f952648077ccd76199df70e8b044cb971027324554b510b6"} Feb 27 00:48:02 crc kubenswrapper[4781]: I0227 00:48:02.632492 4781 generic.go:334] "Generic (PLEG): container finished" podID="231f1edd-305c-4a6c-bd4e-11c12c2ae515" containerID="2518570ffdceb97ceb198f4ca24bb08d3d0c202488b87c6e1650891fc7084042" exitCode=0 Feb 27 00:48:02 crc kubenswrapper[4781]: I0227 00:48:02.632558 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29535888-nb28f" event={"ID":"231f1edd-305c-4a6c-bd4e-11c12c2ae515","Type":"ContainerDied","Data":"2518570ffdceb97ceb198f4ca24bb08d3d0c202488b87c6e1650891fc7084042"} Feb 27 00:48:04 crc kubenswrapper[4781]: I0227 00:48:04.104359 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535888-nb28f" Feb 27 00:48:04 crc kubenswrapper[4781]: I0227 00:48:04.191128 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lh24\" (UniqueName: \"kubernetes.io/projected/231f1edd-305c-4a6c-bd4e-11c12c2ae515-kube-api-access-9lh24\") pod \"231f1edd-305c-4a6c-bd4e-11c12c2ae515\" (UID: \"231f1edd-305c-4a6c-bd4e-11c12c2ae515\") " Feb 27 00:48:04 crc kubenswrapper[4781]: I0227 00:48:04.209565 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231f1edd-305c-4a6c-bd4e-11c12c2ae515-kube-api-access-9lh24" (OuterVolumeSpecName: "kube-api-access-9lh24") pod "231f1edd-305c-4a6c-bd4e-11c12c2ae515" (UID: "231f1edd-305c-4a6c-bd4e-11c12c2ae515"). InnerVolumeSpecName "kube-api-access-9lh24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:48:04 crc kubenswrapper[4781]: I0227 00:48:04.293485 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lh24\" (UniqueName: \"kubernetes.io/projected/231f1edd-305c-4a6c-bd4e-11c12c2ae515-kube-api-access-9lh24\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:04 crc kubenswrapper[4781]: I0227 00:48:04.653897 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535888-nb28f" event={"ID":"231f1edd-305c-4a6c-bd4e-11c12c2ae515","Type":"ContainerDied","Data":"a0c41d90becd6a07f952648077ccd76199df70e8b044cb971027324554b510b6"} Feb 27 00:48:04 crc kubenswrapper[4781]: I0227 00:48:04.653952 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0c41d90becd6a07f952648077ccd76199df70e8b044cb971027324554b510b6" Feb 27 00:48:04 crc kubenswrapper[4781]: I0227 00:48:04.654013 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535888-nb28f" Feb 27 00:48:05 crc kubenswrapper[4781]: I0227 00:48:05.187870 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535882-skl65"] Feb 27 00:48:05 crc kubenswrapper[4781]: I0227 00:48:05.201878 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535882-skl65"] Feb 27 00:48:05 crc kubenswrapper[4781]: I0227 00:48:05.321287 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29db339c-88ad-410b-bad1-e5f5328e9a0a" path="/var/lib/kubelet/pods/29db339c-88ad-410b-bad1-e5f5328e9a0a/volumes" Feb 27 00:48:05 crc kubenswrapper[4781]: I0227 00:48:05.887512 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vpfn4"] Feb 27 00:48:05 crc kubenswrapper[4781]: E0227 00:48:05.888085 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="231f1edd-305c-4a6c-bd4e-11c12c2ae515" containerName="oc" Feb 27 00:48:05 crc kubenswrapper[4781]: I0227 00:48:05.888108 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="231f1edd-305c-4a6c-bd4e-11c12c2ae515" containerName="oc" Feb 27 00:48:05 crc kubenswrapper[4781]: I0227 00:48:05.888346 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="231f1edd-305c-4a6c-bd4e-11c12c2ae515" containerName="oc" Feb 27 00:48:05 crc kubenswrapper[4781]: I0227 00:48:05.890521 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:05 crc kubenswrapper[4781]: I0227 00:48:05.899876 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpfn4"] Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.026648 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-utilities\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.026810 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f97rp\" (UniqueName: \"kubernetes.io/projected/577f63dd-8b20-434a-ae9b-3d9589f08ccf-kube-api-access-f97rp\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.026939 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-catalog-content\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " 
pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.129042 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-utilities\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.129177 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f97rp\" (UniqueName: \"kubernetes.io/projected/577f63dd-8b20-434a-ae9b-3d9589f08ccf-kube-api-access-f97rp\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.129304 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-catalog-content\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.129689 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-utilities\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.129926 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-catalog-content\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " pod="openshift-marketplace/redhat-marketplace-vpfn4" 
Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.149435 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f97rp\" (UniqueName: \"kubernetes.io/projected/577f63dd-8b20-434a-ae9b-3d9589f08ccf-kube-api-access-f97rp\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.211338 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: W0227 00:48:06.756147 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod577f63dd_8b20_434a_ae9b_3d9589f08ccf.slice/crio-b9571a6077784c10dc10f222b8867d8ccc892e934abbc12de5aadb40b3104057 WatchSource:0}: Error finding container b9571a6077784c10dc10f222b8867d8ccc892e934abbc12de5aadb40b3104057: Status 404 returned error can't find the container with id b9571a6077784c10dc10f222b8867d8ccc892e934abbc12de5aadb40b3104057 Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.758773 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpfn4"] Feb 27 00:48:07 crc kubenswrapper[4781]: I0227 00:48:07.687459 4781 generic.go:334] "Generic (PLEG): container finished" podID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerID="bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260" exitCode=0 Feb 27 00:48:07 crc kubenswrapper[4781]: I0227 00:48:07.687853 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpfn4" event={"ID":"577f63dd-8b20-434a-ae9b-3d9589f08ccf","Type":"ContainerDied","Data":"bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260"} Feb 27 00:48:07 crc kubenswrapper[4781]: I0227 00:48:07.687885 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-vpfn4" event={"ID":"577f63dd-8b20-434a-ae9b-3d9589f08ccf","Type":"ContainerStarted","Data":"b9571a6077784c10dc10f222b8867d8ccc892e934abbc12de5aadb40b3104057"} Feb 27 00:48:09 crc kubenswrapper[4781]: I0227 00:48:09.715504 4781 generic.go:334] "Generic (PLEG): container finished" podID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerID="66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c" exitCode=0 Feb 27 00:48:09 crc kubenswrapper[4781]: I0227 00:48:09.715598 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpfn4" event={"ID":"577f63dd-8b20-434a-ae9b-3d9589f08ccf","Type":"ContainerDied","Data":"66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c"} Feb 27 00:48:10 crc kubenswrapper[4781]: I0227 00:48:10.726490 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpfn4" event={"ID":"577f63dd-8b20-434a-ae9b-3d9589f08ccf","Type":"ContainerStarted","Data":"7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac"} Feb 27 00:48:10 crc kubenswrapper[4781]: I0227 00:48:10.750248 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vpfn4" podStartSLOduration=3.333062849 podStartE2EDuration="5.750224573s" podCreationTimestamp="2026-02-27 00:48:05 +0000 UTC" firstStartedPulling="2026-02-27 00:48:07.694047167 +0000 UTC m=+2556.951589661" lastFinishedPulling="2026-02-27 00:48:10.111211831 +0000 UTC m=+2559.368751385" observedRunningTime="2026-02-27 00:48:10.745484058 +0000 UTC m=+2560.003023622" watchObservedRunningTime="2026-02-27 00:48:10.750224573 +0000 UTC m=+2560.007764127" Feb 27 00:48:14 crc kubenswrapper[4781]: I0227 00:48:14.309575 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:48:14 crc kubenswrapper[4781]: E0227 00:48:14.310536 4781 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:48:16 crc kubenswrapper[4781]: I0227 00:48:16.212131 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:16 crc kubenswrapper[4781]: I0227 00:48:16.212492 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:16 crc kubenswrapper[4781]: I0227 00:48:16.264747 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:16 crc kubenswrapper[4781]: I0227 00:48:16.825651 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:16 crc kubenswrapper[4781]: I0227 00:48:16.873114 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpfn4"] Feb 27 00:48:18 crc kubenswrapper[4781]: I0227 00:48:18.798579 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vpfn4" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerName="registry-server" containerID="cri-o://7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac" gracePeriod=2 Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.562993 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.640411 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f97rp\" (UniqueName: \"kubernetes.io/projected/577f63dd-8b20-434a-ae9b-3d9589f08ccf-kube-api-access-f97rp\") pod \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.640548 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-catalog-content\") pod \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.640709 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-utilities\") pod \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.642401 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-utilities" (OuterVolumeSpecName: "utilities") pod "577f63dd-8b20-434a-ae9b-3d9589f08ccf" (UID: "577f63dd-8b20-434a-ae9b-3d9589f08ccf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.651564 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/577f63dd-8b20-434a-ae9b-3d9589f08ccf-kube-api-access-f97rp" (OuterVolumeSpecName: "kube-api-access-f97rp") pod "577f63dd-8b20-434a-ae9b-3d9589f08ccf" (UID: "577f63dd-8b20-434a-ae9b-3d9589f08ccf"). InnerVolumeSpecName "kube-api-access-f97rp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.668274 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "577f63dd-8b20-434a-ae9b-3d9589f08ccf" (UID: "577f63dd-8b20-434a-ae9b-3d9589f08ccf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.742924 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.742963 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f97rp\" (UniqueName: \"kubernetes.io/projected/577f63dd-8b20-434a-ae9b-3d9589f08ccf-kube-api-access-f97rp\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.742978 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.811505 4781 generic.go:334] "Generic (PLEG): container finished" podID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerID="7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac" exitCode=0 Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.811559 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpfn4" event={"ID":"577f63dd-8b20-434a-ae9b-3d9589f08ccf","Type":"ContainerDied","Data":"7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac"} Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.811595 4781 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.811618 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpfn4" event={"ID":"577f63dd-8b20-434a-ae9b-3d9589f08ccf","Type":"ContainerDied","Data":"b9571a6077784c10dc10f222b8867d8ccc892e934abbc12de5aadb40b3104057"} Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.811653 4781 scope.go:117] "RemoveContainer" containerID="7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.834002 4781 scope.go:117] "RemoveContainer" containerID="66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.864330 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpfn4"] Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.881288 4781 scope.go:117] "RemoveContainer" containerID="bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.883080 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpfn4"] Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.925960 4781 scope.go:117] "RemoveContainer" containerID="7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac" Feb 27 00:48:19 crc kubenswrapper[4781]: E0227 00:48:19.926477 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac\": container with ID starting with 7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac not found: ID does not exist" containerID="7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.926507 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac"} err="failed to get container status \"7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac\": rpc error: code = NotFound desc = could not find container \"7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac\": container with ID starting with 7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac not found: ID does not exist" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.926533 4781 scope.go:117] "RemoveContainer" containerID="66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c" Feb 27 00:48:19 crc kubenswrapper[4781]: E0227 00:48:19.926870 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c\": container with ID starting with 66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c not found: ID does not exist" containerID="66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.926920 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c"} err="failed to get container status \"66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c\": rpc error: code = NotFound desc = could not find container \"66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c\": container with ID starting with 66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c not found: ID does not exist" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.926951 4781 scope.go:117] "RemoveContainer" containerID="bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260" Feb 27 00:48:19 crc kubenswrapper[4781]: E0227 
00:48:19.927302 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260\": container with ID starting with bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260 not found: ID does not exist" containerID="bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260"
Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.927368 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260"} err="failed to get container status \"bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260\": rpc error: code = NotFound desc = could not find container \"bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260\": container with ID starting with bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260 not found: ID does not exist"
Feb 27 00:48:21 crc kubenswrapper[4781]: I0227 00:48:21.320514 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" path="/var/lib/kubelet/pods/577f63dd-8b20-434a-ae9b-3d9589f08ccf/volumes"
Feb 27 00:48:28 crc kubenswrapper[4781]: I0227 00:48:28.311280 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"
Feb 27 00:48:28 crc kubenswrapper[4781]: E0227 00:48:28.312919 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:48:31 crc kubenswrapper[4781]: I0227 00:48:31.482681 4781 scope.go:117] "RemoveContainer" containerID="bcc82c4ff93196fe9d1d81964a39e384053e68533a13a500ed58309dd14ee8eb"
Feb 27 00:48:39 crc kubenswrapper[4781]: I0227 00:48:39.309651 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"
Feb 27 00:48:39 crc kubenswrapper[4781]: E0227 00:48:39.310433 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:48:42 crc kubenswrapper[4781]: I0227 00:48:42.224266 4781 generic.go:334] "Generic (PLEG): container finished" podID="d3f8abc3-17b4-4d88-890e-85304a100a97" containerID="d3e6c31e59c8273a4822b6ba92413f35b26f4d3e1b11014494798bec77bd763c" exitCode=0
Feb 27 00:48:42 crc kubenswrapper[4781]: I0227 00:48:42.224327 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" event={"ID":"d3f8abc3-17b4-4d88-890e-85304a100a97","Type":"ContainerDied","Data":"d3e6c31e59c8273a4822b6ba92413f35b26f4d3e1b11014494798bec77bd763c"}
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.803310 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h"
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.899950 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-1\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") "
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900024 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-extra-config-0\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") "
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900067 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-1\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") "
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900110 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-0\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") "
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900214 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-ssh-key-openstack-edpm-ipam\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") "
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900335 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-0\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") "
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900374 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-3\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") "
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900404 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-2\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") "
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900499 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7qk5\" (UniqueName: \"kubernetes.io/projected/d3f8abc3-17b4-4d88-890e-85304a100a97-kube-api-access-n7qk5\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") "
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900556 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-combined-ca-bundle\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") "
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900608 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-inventory\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") "
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.912097 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f8abc3-17b4-4d88-890e-85304a100a97-kube-api-access-n7qk5" (OuterVolumeSpecName: "kube-api-access-n7qk5") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "kube-api-access-n7qk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.933982 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.944197 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.948183 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.955846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.957916 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.958403 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.971544 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-inventory" (OuterVolumeSpecName: "inventory") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.981454 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.981506 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.982846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.002923 4781 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.002973 4781 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.002987 4781 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.002999 4781 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.003012 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.003024 4781 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.003036 4781 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.003049 4781 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.003063 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7qk5\" (UniqueName: \"kubernetes.io/projected/d3f8abc3-17b4-4d88-890e-85304a100a97-kube-api-access-n7qk5\") on node \"crc\" DevicePath \"\""
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.003075 4781 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.003087 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-inventory\") on node \"crc\" DevicePath \"\""
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.246337 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" event={"ID":"d3f8abc3-17b4-4d88-890e-85304a100a97","Type":"ContainerDied","Data":"0f40d07d67261eda4bba8df0dd754f507383635699da4a2039a40542ef874ffe"}
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.246704 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f40d07d67261eda4bba8df0dd754f507383635699da4a2039a40542ef874ffe"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.246424 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.357847 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"]
Feb 27 00:48:44 crc kubenswrapper[4781]: E0227 00:48:44.358295 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerName="extract-content"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.358319 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerName="extract-content"
Feb 27 00:48:44 crc kubenswrapper[4781]: E0227 00:48:44.358341 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerName="extract-utilities"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.358347 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerName="extract-utilities"
Feb 27 00:48:44 crc kubenswrapper[4781]: E0227 00:48:44.358375 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerName="registry-server"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.358383 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerName="registry-server"
Feb 27 00:48:44 crc kubenswrapper[4781]: E0227 00:48:44.358395 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f8abc3-17b4-4d88-890e-85304a100a97" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.358401 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f8abc3-17b4-4d88-890e-85304a100a97" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.358578 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerName="registry-server"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.358594 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f8abc3-17b4-4d88-890e-85304a100a97" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.359959 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.361856 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.362027 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.365370 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.365898 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.366127 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.378477 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"]
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.410787 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.410853 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.410891 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.411064 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78gjh\" (UniqueName: \"kubernetes.io/projected/7a6c3903-7dfd-49cd-a92f-d138e10db404-kube-api-access-78gjh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.411241 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.411455 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.411598 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.515095 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.515962 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.516036 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.516115 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.516151 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.516194 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.516220 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78gjh\" (UniqueName: \"kubernetes.io/projected/7a6c3903-7dfd-49cd-a92f-d138e10db404-kube-api-access-78gjh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.521211 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.521225 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.521284 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.521604 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.521785 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.532074 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.537262 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78gjh\" (UniqueName: \"kubernetes.io/projected/7a6c3903-7dfd-49cd-a92f-d138e10db404-kube-api-access-78gjh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.679868 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:48:45 crc kubenswrapper[4781]: I0227 00:48:45.227927 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"]
Feb 27 00:48:45 crc kubenswrapper[4781]: I0227 00:48:45.257510 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" event={"ID":"7a6c3903-7dfd-49cd-a92f-d138e10db404","Type":"ContainerStarted","Data":"c98c73db13c2e6f240e777de96c450d5ef1c4ef457d610c978e3c63e24c6b834"}
Feb 27 00:48:46 crc kubenswrapper[4781]: I0227 00:48:46.267700 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" event={"ID":"7a6c3903-7dfd-49cd-a92f-d138e10db404","Type":"ContainerStarted","Data":"892f0bf5e76001c655bd1216bacfc80b10ec06394101b3e897d30710d368bcae"}
Feb 27 00:48:46 crc kubenswrapper[4781]: I0227 00:48:46.286466 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" podStartSLOduration=1.7288784750000001 podStartE2EDuration="2.286450882s" podCreationTimestamp="2026-02-27 00:48:44 +0000 UTC" firstStartedPulling="2026-02-27 00:48:45.230210819 +0000 UTC m=+2594.487750373" lastFinishedPulling="2026-02-27 00:48:45.787783236 +0000 UTC m=+2595.045322780" observedRunningTime="2026-02-27 00:48:46.283196867 +0000 UTC m=+2595.540736431" watchObservedRunningTime="2026-02-27 00:48:46.286450882 +0000 UTC m=+2595.543990436"
Feb 27 00:48:51 crc kubenswrapper[4781]: I0227 00:48:51.316446 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"
Feb 27 00:48:51 crc kubenswrapper[4781]: E0227 00:48:51.317273 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:49:06 crc kubenswrapper[4781]: I0227 00:49:06.309999 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"
Feb 27 00:49:06 crc kubenswrapper[4781]: E0227 00:49:06.311017 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:49:21 crc kubenswrapper[4781]: I0227 00:49:21.316763 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"
Feb 27 00:49:21 crc kubenswrapper[4781]: E0227 00:49:21.317660 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:49:35 crc kubenswrapper[4781]: I0227 00:49:35.309476 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"
Feb 27 00:49:35 crc kubenswrapper[4781]: E0227 00:49:35.310293 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:49:47 crc kubenswrapper[4781]: I0227 00:49:47.309366 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"
Feb 27 00:49:47 crc kubenswrapper[4781]: E0227 00:49:47.310162 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.145257 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535890-nv49g"]
Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.147271 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535890-nv49g"
Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.150364 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr"
Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.150615 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.150765 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.155959 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535890-nv49g"]
Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.267445 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t45t\" (UniqueName: \"kubernetes.io/projected/88fd3abb-2996-49d0-851b-41e0040438fa-kube-api-access-4t45t\") pod \"auto-csr-approver-29535890-nv49g\" (UID: \"88fd3abb-2996-49d0-851b-41e0040438fa\") " pod="openshift-infra/auto-csr-approver-29535890-nv49g"
Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.309539 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"
Feb 27 00:50:00 crc kubenswrapper[4781]: E0227 00:50:00.309930 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.370216 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t45t\" (UniqueName: \"kubernetes.io/projected/88fd3abb-2996-49d0-851b-41e0040438fa-kube-api-access-4t45t\") pod \"auto-csr-approver-29535890-nv49g\" (UID: \"88fd3abb-2996-49d0-851b-41e0040438fa\") " pod="openshift-infra/auto-csr-approver-29535890-nv49g"
Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.391453 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t45t\" (UniqueName: \"kubernetes.io/projected/88fd3abb-2996-49d0-851b-41e0040438fa-kube-api-access-4t45t\") pod \"auto-csr-approver-29535890-nv49g\" (UID: \"88fd3abb-2996-49d0-851b-41e0040438fa\") " pod="openshift-infra/auto-csr-approver-29535890-nv49g"
Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.477419 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535890-nv49g"
Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.985843 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535890-nv49g"]
Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.997468 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535890-nv49g" event={"ID":"88fd3abb-2996-49d0-851b-41e0040438fa","Type":"ContainerStarted","Data":"c7b350d60c82e6d7b71728768683583c947111cd6ca7fe9cb3ff82449ef63dbf"}
Feb 27 00:50:03 crc kubenswrapper[4781]: I0227 00:50:03.018232 4781 generic.go:334] "Generic (PLEG): container finished" podID="88fd3abb-2996-49d0-851b-41e0040438fa" containerID="f2bb4ab5a55c1440d2f1c4f2cba63824f23a7c027f89afef51d965a575920c2b" exitCode=0
Feb 27 00:50:03 crc kubenswrapper[4781]: I0227 00:50:03.018309 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535890-nv49g" event={"ID":"88fd3abb-2996-49d0-851b-41e0040438fa","Type":"ContainerDied","Data":"f2bb4ab5a55c1440d2f1c4f2cba63824f23a7c027f89afef51d965a575920c2b"}
Feb 27 00:50:04 crc kubenswrapper[4781]: I0227 00:50:04.430616 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535890-nv49g"
Feb 27 00:50:04 crc kubenswrapper[4781]: I0227 00:50:04.561548 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t45t\" (UniqueName: \"kubernetes.io/projected/88fd3abb-2996-49d0-851b-41e0040438fa-kube-api-access-4t45t\") pod \"88fd3abb-2996-49d0-851b-41e0040438fa\" (UID: \"88fd3abb-2996-49d0-851b-41e0040438fa\") "
Feb 27 00:50:04 crc kubenswrapper[4781]: I0227 00:50:04.567374 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88fd3abb-2996-49d0-851b-41e0040438fa-kube-api-access-4t45t" (OuterVolumeSpecName: "kube-api-access-4t45t") pod "88fd3abb-2996-49d0-851b-41e0040438fa" (UID: "88fd3abb-2996-49d0-851b-41e0040438fa"). InnerVolumeSpecName "kube-api-access-4t45t".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:50:04 crc kubenswrapper[4781]: I0227 00:50:04.664224 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t45t\" (UniqueName: \"kubernetes.io/projected/88fd3abb-2996-49d0-851b-41e0040438fa-kube-api-access-4t45t\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:05 crc kubenswrapper[4781]: I0227 00:50:05.041507 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535890-nv49g" event={"ID":"88fd3abb-2996-49d0-851b-41e0040438fa","Type":"ContainerDied","Data":"c7b350d60c82e6d7b71728768683583c947111cd6ca7fe9cb3ff82449ef63dbf"} Feb 27 00:50:05 crc kubenswrapper[4781]: I0227 00:50:05.041553 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7b350d60c82e6d7b71728768683583c947111cd6ca7fe9cb3ff82449ef63dbf" Feb 27 00:50:05 crc kubenswrapper[4781]: I0227 00:50:05.041611 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535890-nv49g" Feb 27 00:50:05 crc kubenswrapper[4781]: I0227 00:50:05.505284 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535884-8t2lb"] Feb 27 00:50:05 crc kubenswrapper[4781]: I0227 00:50:05.515428 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535884-8t2lb"] Feb 27 00:50:07 crc kubenswrapper[4781]: I0227 00:50:07.319949 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="018f4ff5-f081-4257-8189-3eb14ea035f3" path="/var/lib/kubelet/pods/018f4ff5-f081-4257-8189-3eb14ea035f3/volumes" Feb 27 00:50:13 crc kubenswrapper[4781]: I0227 00:50:13.311882 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:50:13 crc kubenswrapper[4781]: E0227 00:50:13.329365 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.507088 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qswm5"] Feb 27 00:50:24 crc kubenswrapper[4781]: E0227 00:50:24.509315 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88fd3abb-2996-49d0-851b-41e0040438fa" containerName="oc" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.509413 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="88fd3abb-2996-49d0-851b-41e0040438fa" containerName="oc" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.509795 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="88fd3abb-2996-49d0-851b-41e0040438fa" containerName="oc" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.511884 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.522431 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qswm5"] Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.585615 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtf8b\" (UniqueName: \"kubernetes.io/projected/e55472e5-5c75-4b68-9c22-dcf37baffe6a-kube-api-access-mtf8b\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.585673 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-utilities\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.585784 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-catalog-content\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.688257 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtf8b\" (UniqueName: \"kubernetes.io/projected/e55472e5-5c75-4b68-9c22-dcf37baffe6a-kube-api-access-mtf8b\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.688320 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-utilities\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.688441 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-catalog-content\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.689112 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-utilities\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.689147 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-catalog-content\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.711096 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtf8b\" (UniqueName: \"kubernetes.io/projected/e55472e5-5c75-4b68-9c22-dcf37baffe6a-kube-api-access-mtf8b\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.839197 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:25 crc kubenswrapper[4781]: I0227 00:50:25.313201 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:50:25 crc kubenswrapper[4781]: E0227 00:50:25.314117 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:50:25 crc kubenswrapper[4781]: I0227 00:50:25.380982 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qswm5"] Feb 27 00:50:26 crc kubenswrapper[4781]: I0227 00:50:26.251805 4781 generic.go:334] "Generic (PLEG): container finished" podID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerID="ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74" exitCode=0 Feb 27 00:50:26 crc kubenswrapper[4781]: I0227 00:50:26.251946 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qswm5" event={"ID":"e55472e5-5c75-4b68-9c22-dcf37baffe6a","Type":"ContainerDied","Data":"ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74"} Feb 27 00:50:26 crc kubenswrapper[4781]: I0227 00:50:26.252228 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qswm5" event={"ID":"e55472e5-5c75-4b68-9c22-dcf37baffe6a","Type":"ContainerStarted","Data":"ec42b700956099171afc0f35a3868f09a1f40ac9aa51906b323a60c042870bbe"} Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.697340 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xlssc"] Feb 27 
00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.706425 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.712229 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xlssc"] Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.856809 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-catalog-content\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.856909 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcb47\" (UniqueName: \"kubernetes.io/projected/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-kube-api-access-pcb47\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.856953 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-utilities\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.959273 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcb47\" (UniqueName: \"kubernetes.io/projected/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-kube-api-access-pcb47\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 
00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.960067 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-utilities\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.960484 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-utilities\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.960675 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-catalog-content\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.960926 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-catalog-content\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.986341 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcb47\" (UniqueName: \"kubernetes.io/projected/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-kube-api-access-pcb47\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:28 crc kubenswrapper[4781]: I0227 00:50:28.028883 4781 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:28 crc kubenswrapper[4781]: I0227 00:50:28.274926 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qswm5" event={"ID":"e55472e5-5c75-4b68-9c22-dcf37baffe6a","Type":"ContainerStarted","Data":"4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269"} Feb 27 00:50:28 crc kubenswrapper[4781]: I0227 00:50:28.564899 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xlssc"] Feb 27 00:50:28 crc kubenswrapper[4781]: W0227 00:50:28.570027 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f4ba98e_f6f5_41cc_8618_22dfb8700b4c.slice/crio-556870de211a67a576b43464c6f861f745bcebf6538ad4c84adeb7c03d8872b7 WatchSource:0}: Error finding container 556870de211a67a576b43464c6f861f745bcebf6538ad4c84adeb7c03d8872b7: Status 404 returned error can't find the container with id 556870de211a67a576b43464c6f861f745bcebf6538ad4c84adeb7c03d8872b7 Feb 27 00:50:29 crc kubenswrapper[4781]: I0227 00:50:29.286560 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerID="e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613" exitCode=0 Feb 27 00:50:29 crc kubenswrapper[4781]: I0227 00:50:29.286666 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlssc" event={"ID":"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c","Type":"ContainerDied","Data":"e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613"} Feb 27 00:50:29 crc kubenswrapper[4781]: I0227 00:50:29.286996 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlssc" 
event={"ID":"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c","Type":"ContainerStarted","Data":"556870de211a67a576b43464c6f861f745bcebf6538ad4c84adeb7c03d8872b7"} Feb 27 00:50:30 crc kubenswrapper[4781]: I0227 00:50:30.299595 4781 generic.go:334] "Generic (PLEG): container finished" podID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerID="4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269" exitCode=0 Feb 27 00:50:30 crc kubenswrapper[4781]: I0227 00:50:30.299674 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qswm5" event={"ID":"e55472e5-5c75-4b68-9c22-dcf37baffe6a","Type":"ContainerDied","Data":"4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269"} Feb 27 00:50:31 crc kubenswrapper[4781]: I0227 00:50:31.323959 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qswm5" event={"ID":"e55472e5-5c75-4b68-9c22-dcf37baffe6a","Type":"ContainerStarted","Data":"448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6"} Feb 27 00:50:31 crc kubenswrapper[4781]: I0227 00:50:31.324275 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlssc" event={"ID":"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c","Type":"ContainerStarted","Data":"f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9"} Feb 27 00:50:31 crc kubenswrapper[4781]: I0227 00:50:31.363346 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qswm5" podStartSLOduration=2.875918501 podStartE2EDuration="7.363327507s" podCreationTimestamp="2026-02-27 00:50:24 +0000 UTC" firstStartedPulling="2026-02-27 00:50:26.25408046 +0000 UTC m=+2695.511620014" lastFinishedPulling="2026-02-27 00:50:30.741489466 +0000 UTC m=+2699.999029020" observedRunningTime="2026-02-27 00:50:31.35700478 +0000 UTC m=+2700.614544334" watchObservedRunningTime="2026-02-27 00:50:31.363327507 +0000 UTC 
m=+2700.620867061" Feb 27 00:50:31 crc kubenswrapper[4781]: I0227 00:50:31.588831 4781 scope.go:117] "RemoveContainer" containerID="26e013582f5ee2e314ebc2f4329b87db88bd3251fee9e3e932b5b02ee387f73b" Feb 27 00:50:34 crc kubenswrapper[4781]: I0227 00:50:34.839649 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:34 crc kubenswrapper[4781]: I0227 00:50:34.840398 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:35 crc kubenswrapper[4781]: I0227 00:50:35.897936 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-qswm5" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="registry-server" probeResult="failure" output=< Feb 27 00:50:35 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 00:50:35 crc kubenswrapper[4781]: > Feb 27 00:50:36 crc kubenswrapper[4781]: I0227 00:50:36.371485 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerID="f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9" exitCode=0 Feb 27 00:50:36 crc kubenswrapper[4781]: I0227 00:50:36.371534 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlssc" event={"ID":"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c","Type":"ContainerDied","Data":"f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9"} Feb 27 00:50:37 crc kubenswrapper[4781]: I0227 00:50:37.407086 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlssc" event={"ID":"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c","Type":"ContainerStarted","Data":"137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9"} Feb 27 00:50:37 crc kubenswrapper[4781]: I0227 00:50:37.433451 4781 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xlssc" podStartSLOduration=2.865682895 podStartE2EDuration="10.433430393s" podCreationTimestamp="2026-02-27 00:50:27 +0000 UTC" firstStartedPulling="2026-02-27 00:50:29.288601725 +0000 UTC m=+2698.546141279" lastFinishedPulling="2026-02-27 00:50:36.856349233 +0000 UTC m=+2706.113888777" observedRunningTime="2026-02-27 00:50:37.427495607 +0000 UTC m=+2706.685035181" watchObservedRunningTime="2026-02-27 00:50:37.433430393 +0000 UTC m=+2706.690969947" Feb 27 00:50:38 crc kubenswrapper[4781]: I0227 00:50:38.029995 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:38 crc kubenswrapper[4781]: I0227 00:50:38.030234 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:39 crc kubenswrapper[4781]: I0227 00:50:39.080089 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xlssc" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="registry-server" probeResult="failure" output=< Feb 27 00:50:39 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 00:50:39 crc kubenswrapper[4781]: > Feb 27 00:50:39 crc kubenswrapper[4781]: I0227 00:50:39.310189 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:50:39 crc kubenswrapper[4781]: E0227 00:50:39.310542 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 
00:50:45 crc kubenswrapper[4781]: I0227 00:50:45.894064 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-qswm5" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="registry-server" probeResult="failure" output=< Feb 27 00:50:45 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 00:50:45 crc kubenswrapper[4781]: > Feb 27 00:50:48 crc kubenswrapper[4781]: I0227 00:50:48.076789 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:48 crc kubenswrapper[4781]: I0227 00:50:48.129812 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:48 crc kubenswrapper[4781]: I0227 00:50:48.312664 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xlssc"] Feb 27 00:50:49 crc kubenswrapper[4781]: I0227 00:50:49.510167 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xlssc" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="registry-server" containerID="cri-o://137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9" gracePeriod=2 Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.133401 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.264523 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-catalog-content\") pod \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.264566 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-utilities\") pod \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.264678 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcb47\" (UniqueName: \"kubernetes.io/projected/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-kube-api-access-pcb47\") pod \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.265528 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-utilities" (OuterVolumeSpecName: "utilities") pod "3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" (UID: "3f4ba98e-f6f5-41cc-8618-22dfb8700b4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.272818 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-kube-api-access-pcb47" (OuterVolumeSpecName: "kube-api-access-pcb47") pod "3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" (UID: "3f4ba98e-f6f5-41cc-8618-22dfb8700b4c"). InnerVolumeSpecName "kube-api-access-pcb47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.367214 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.367252 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcb47\" (UniqueName: \"kubernetes.io/projected/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-kube-api-access-pcb47\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.409889 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" (UID: "3f4ba98e-f6f5-41cc-8618-22dfb8700b4c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.469893 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.520795 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerID="137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9" exitCode=0 Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.520844 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xlssc"
Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.520856 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlssc" event={"ID":"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c","Type":"ContainerDied","Data":"137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9"}
Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.522181 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlssc" event={"ID":"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c","Type":"ContainerDied","Data":"556870de211a67a576b43464c6f861f745bcebf6538ad4c84adeb7c03d8872b7"}
Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.522257 4781 scope.go:117] "RemoveContainer" containerID="137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9"
Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.556271 4781 scope.go:117] "RemoveContainer" containerID="f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9"
Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.583467 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xlssc"]
Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.593863 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xlssc"]
Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.709837 4781 scope.go:117] "RemoveContainer" containerID="e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613"
Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.782790 4781 scope.go:117] "RemoveContainer" containerID="137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9"
Feb 27 00:50:50 crc kubenswrapper[4781]: E0227 00:50:50.790869 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9\": container with ID starting with 137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9 not found: ID does not exist" containerID="137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9"
Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.790919 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9"} err="failed to get container status \"137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9\": rpc error: code = NotFound desc = could not find container \"137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9\": container with ID starting with 137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9 not found: ID does not exist"
Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.790946 4781 scope.go:117] "RemoveContainer" containerID="f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9"
Feb 27 00:50:50 crc kubenswrapper[4781]: E0227 00:50:50.792867 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9\": container with ID starting with f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9 not found: ID does not exist" containerID="f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9"
Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.792923 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9"} err="failed to get container status \"f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9\": rpc error: code = NotFound desc = could not find container \"f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9\": container with ID starting with f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9 not found: ID does not exist"
Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.792966 4781 scope.go:117] "RemoveContainer" containerID="e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613"
Feb 27 00:50:50 crc kubenswrapper[4781]: E0227 00:50:50.793234 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613\": container with ID starting with e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613 not found: ID does not exist" containerID="e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613"
Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.793256 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613"} err="failed to get container status \"e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613\": rpc error: code = NotFound desc = could not find container \"e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613\": container with ID starting with e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613 not found: ID does not exist"
Feb 27 00:50:51 crc kubenswrapper[4781]: I0227 00:50:51.315615 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"
Feb 27 00:50:51 crc kubenswrapper[4781]: E0227 00:50:51.315899 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:50:51 crc kubenswrapper[4781]: I0227 00:50:51.320865 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" path="/var/lib/kubelet/pods/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c/volumes"
Feb 27 00:50:51 crc kubenswrapper[4781]: I0227 00:50:51.536420 4781 generic.go:334] "Generic (PLEG): container finished" podID="7a6c3903-7dfd-49cd-a92f-d138e10db404" containerID="892f0bf5e76001c655bd1216bacfc80b10ec06394101b3e897d30710d368bcae" exitCode=0
Feb 27 00:50:51 crc kubenswrapper[4781]: I0227 00:50:51.536470 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" event={"ID":"7a6c3903-7dfd-49cd-a92f-d138e10db404","Type":"ContainerDied","Data":"892f0bf5e76001c655bd1216bacfc80b10ec06394101b3e897d30710d368bcae"}
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.090903 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.136739 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ssh-key-openstack-edpm-ipam\") pod \"7a6c3903-7dfd-49cd-a92f-d138e10db404\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") "
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.136793 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-2\") pod \"7a6c3903-7dfd-49cd-a92f-d138e10db404\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") "
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.136820 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-inventory\") pod \"7a6c3903-7dfd-49cd-a92f-d138e10db404\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") "
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.136874 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-0\") pod \"7a6c3903-7dfd-49cd-a92f-d138e10db404\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") "
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.136899 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-telemetry-combined-ca-bundle\") pod \"7a6c3903-7dfd-49cd-a92f-d138e10db404\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") "
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.136920 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78gjh\" (UniqueName: \"kubernetes.io/projected/7a6c3903-7dfd-49cd-a92f-d138e10db404-kube-api-access-78gjh\") pod \"7a6c3903-7dfd-49cd-a92f-d138e10db404\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") "
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.136997 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-1\") pod \"7a6c3903-7dfd-49cd-a92f-d138e10db404\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") "
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.152810 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7a6c3903-7dfd-49cd-a92f-d138e10db404" (UID: "7a6c3903-7dfd-49cd-a92f-d138e10db404"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.161855 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a6c3903-7dfd-49cd-a92f-d138e10db404-kube-api-access-78gjh" (OuterVolumeSpecName: "kube-api-access-78gjh") pod "7a6c3903-7dfd-49cd-a92f-d138e10db404" (UID: "7a6c3903-7dfd-49cd-a92f-d138e10db404"). InnerVolumeSpecName "kube-api-access-78gjh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.197014 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-inventory" (OuterVolumeSpecName: "inventory") pod "7a6c3903-7dfd-49cd-a92f-d138e10db404" (UID: "7a6c3903-7dfd-49cd-a92f-d138e10db404"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.242102 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-inventory\") on node \"crc\" DevicePath \"\""
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.242134 4781 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.242146 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78gjh\" (UniqueName: \"kubernetes.io/projected/7a6c3903-7dfd-49cd-a92f-d138e10db404-kube-api-access-78gjh\") on node \"crc\" DevicePath \"\""
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.242361 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "7a6c3903-7dfd-49cd-a92f-d138e10db404" (UID: "7a6c3903-7dfd-49cd-a92f-d138e10db404"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.280127 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "7a6c3903-7dfd-49cd-a92f-d138e10db404" (UID: "7a6c3903-7dfd-49cd-a92f-d138e10db404"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.297489 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7a6c3903-7dfd-49cd-a92f-d138e10db404" (UID: "7a6c3903-7dfd-49cd-a92f-d138e10db404"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.297724 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "7a6c3903-7dfd-49cd-a92f-d138e10db404" (UID: "7a6c3903-7dfd-49cd-a92f-d138e10db404"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.350028 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.350075 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.350087 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.350099 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.556442 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" event={"ID":"7a6c3903-7dfd-49cd-a92f-d138e10db404","Type":"ContainerDied","Data":"c98c73db13c2e6f240e777de96c450d5ef1c4ef457d610c978e3c63e24c6b834"}
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.556487 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c98c73db13c2e6f240e777de96c450d5ef1c4ef457d610c978e3c63e24c6b834"
Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.556512 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"
Feb 27 00:50:54 crc kubenswrapper[4781]: I0227 00:50:54.898477 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qswm5"
Feb 27 00:50:54 crc kubenswrapper[4781]: I0227 00:50:54.957546 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qswm5"
Feb 27 00:50:55 crc kubenswrapper[4781]: I0227 00:50:55.712986 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qswm5"]
Feb 27 00:50:56 crc kubenswrapper[4781]: I0227 00:50:56.582362 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qswm5" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="registry-server" containerID="cri-o://448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6" gracePeriod=2
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.140597 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qswm5"
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.227140 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-utilities\") pod \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") "
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.227255 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtf8b\" (UniqueName: \"kubernetes.io/projected/e55472e5-5c75-4b68-9c22-dcf37baffe6a-kube-api-access-mtf8b\") pod \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") "
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.227562 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-catalog-content\") pod \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") "
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.229130 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-utilities" (OuterVolumeSpecName: "utilities") pod "e55472e5-5c75-4b68-9c22-dcf37baffe6a" (UID: "e55472e5-5c75-4b68-9c22-dcf37baffe6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.237872 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e55472e5-5c75-4b68-9c22-dcf37baffe6a-kube-api-access-mtf8b" (OuterVolumeSpecName: "kube-api-access-mtf8b") pod "e55472e5-5c75-4b68-9c22-dcf37baffe6a" (UID: "e55472e5-5c75-4b68-9c22-dcf37baffe6a"). InnerVolumeSpecName "kube-api-access-mtf8b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.284924 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e55472e5-5c75-4b68-9c22-dcf37baffe6a" (UID: "e55472e5-5c75-4b68-9c22-dcf37baffe6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.333467 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.333504 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.333515 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtf8b\" (UniqueName: \"kubernetes.io/projected/e55472e5-5c75-4b68-9c22-dcf37baffe6a-kube-api-access-mtf8b\") on node \"crc\" DevicePath \"\""
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.593011 4781 generic.go:334] "Generic (PLEG): container finished" podID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerID="448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6" exitCode=0
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.593054 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qswm5"
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.593061 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qswm5" event={"ID":"e55472e5-5c75-4b68-9c22-dcf37baffe6a","Type":"ContainerDied","Data":"448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6"}
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.593090 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qswm5" event={"ID":"e55472e5-5c75-4b68-9c22-dcf37baffe6a","Type":"ContainerDied","Data":"ec42b700956099171afc0f35a3868f09a1f40ac9aa51906b323a60c042870bbe"}
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.593111 4781 scope.go:117] "RemoveContainer" containerID="448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6"
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.614450 4781 scope.go:117] "RemoveContainer" containerID="4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269"
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.620323 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qswm5"]
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.632058 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qswm5"]
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.633898 4781 scope.go:117] "RemoveContainer" containerID="ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74"
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.680852 4781 scope.go:117] "RemoveContainer" containerID="448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6"
Feb 27 00:50:57 crc kubenswrapper[4781]: E0227 00:50:57.682438 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6\": container with ID starting with 448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6 not found: ID does not exist" containerID="448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6"
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.682474 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6"} err="failed to get container status \"448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6\": rpc error: code = NotFound desc = could not find container \"448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6\": container with ID starting with 448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6 not found: ID does not exist"
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.682494 4781 scope.go:117] "RemoveContainer" containerID="4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269"
Feb 27 00:50:57 crc kubenswrapper[4781]: E0227 00:50:57.685003 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269\": container with ID starting with 4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269 not found: ID does not exist" containerID="4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269"
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.685063 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269"} err="failed to get container status \"4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269\": rpc error: code = NotFound desc = could not find container \"4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269\": container with ID starting with 4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269 not found: ID does not exist"
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.685093 4781 scope.go:117] "RemoveContainer" containerID="ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74"
Feb 27 00:50:57 crc kubenswrapper[4781]: E0227 00:50:57.686588 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74\": container with ID starting with ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74 not found: ID does not exist" containerID="ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74"
Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.686646 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74"} err="failed to get container status \"ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74\": rpc error: code = NotFound desc = could not find container \"ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74\": container with ID starting with ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74 not found: ID does not exist"
Feb 27 00:50:59 crc kubenswrapper[4781]: I0227 00:50:59.321824 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" path="/var/lib/kubelet/pods/e55472e5-5c75-4b68-9c22-dcf37baffe6a/volumes"
Feb 27 00:51:06 crc kubenswrapper[4781]: I0227 00:51:06.310073 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"
Feb 27 00:51:06 crc kubenswrapper[4781]: E0227 00:51:06.310834 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:51:20 crc kubenswrapper[4781]: I0227 00:51:20.309953 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"
Feb 27 00:51:20 crc kubenswrapper[4781]: E0227 00:51:20.310735 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:51:31 crc kubenswrapper[4781]: I0227 00:51:31.319509 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"
Feb 27 00:51:31 crc kubenswrapper[4781]: E0227 00:51:31.320374 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:51:44 crc kubenswrapper[4781]: I0227 00:51:44.309488 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"
Feb 27 00:51:45 crc kubenswrapper[4781]: I0227 00:51:45.007072 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"9eb13c9d0480acfdc7ab15c203347b34a63e0504efc9127264d926b2dd0b3a20"}
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.162294 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535892-5kr7b"]
Feb 27 00:52:00 crc kubenswrapper[4781]: E0227 00:52:00.163363 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="extract-content"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163379 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="extract-content"
Feb 27 00:52:00 crc kubenswrapper[4781]: E0227 00:52:00.163412 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="extract-utilities"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163419 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="extract-utilities"
Feb 27 00:52:00 crc kubenswrapper[4781]: E0227 00:52:00.163432 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="registry-server"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163438 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="registry-server"
Feb 27 00:52:00 crc kubenswrapper[4781]: E0227 00:52:00.163450 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="extract-utilities"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163456 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="extract-utilities"
Feb 27 00:52:00 crc kubenswrapper[4781]: E0227 00:52:00.163467 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="registry-server"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163473 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="registry-server"
Feb 27 00:52:00 crc kubenswrapper[4781]: E0227 00:52:00.163480 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a6c3903-7dfd-49cd-a92f-d138e10db404" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163487 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a6c3903-7dfd-49cd-a92f-d138e10db404" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 27 00:52:00 crc kubenswrapper[4781]: E0227 00:52:00.163503 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="extract-content"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163508 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="extract-content"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163743 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a6c3903-7dfd-49cd-a92f-d138e10db404" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163768 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="registry-server"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163778 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="registry-server"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.164596 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535892-5kr7b"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.167280 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.167726 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.167887 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.175252 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535892-5kr7b"]
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.270144 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhdm6\" (UniqueName: \"kubernetes.io/projected/7eb81f10-0ea8-4376-b588-a3d9462c0bc4-kube-api-access-dhdm6\") pod \"auto-csr-approver-29535892-5kr7b\" (UID: \"7eb81f10-0ea8-4376-b588-a3d9462c0bc4\") " pod="openshift-infra/auto-csr-approver-29535892-5kr7b"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.372668 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhdm6\" (UniqueName: \"kubernetes.io/projected/7eb81f10-0ea8-4376-b588-a3d9462c0bc4-kube-api-access-dhdm6\") pod \"auto-csr-approver-29535892-5kr7b\" (UID: \"7eb81f10-0ea8-4376-b588-a3d9462c0bc4\") " pod="openshift-infra/auto-csr-approver-29535892-5kr7b"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.395377 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhdm6\" (UniqueName: \"kubernetes.io/projected/7eb81f10-0ea8-4376-b588-a3d9462c0bc4-kube-api-access-dhdm6\") pod \"auto-csr-approver-29535892-5kr7b\" (UID: \"7eb81f10-0ea8-4376-b588-a3d9462c0bc4\") " pod="openshift-infra/auto-csr-approver-29535892-5kr7b"
Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.489272 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535892-5kr7b"
Feb 27 00:52:01 crc kubenswrapper[4781]: I0227 00:52:01.007508 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535892-5kr7b"]
Feb 27 00:52:01 crc kubenswrapper[4781]: I0227 00:52:01.161580 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535892-5kr7b" event={"ID":"7eb81f10-0ea8-4376-b588-a3d9462c0bc4","Type":"ContainerStarted","Data":"81f5d9ffd1253c0fe66074826a1be97377eb0943041ec8404b90de2bb8cd82b6"}
Feb 27 00:52:03 crc kubenswrapper[4781]: I0227 00:52:03.182091 4781 generic.go:334] "Generic (PLEG): container finished" podID="7eb81f10-0ea8-4376-b588-a3d9462c0bc4" containerID="f073e5337bf518b81829559352fcea1859d4a5ced7771a4c11f45807c039ab0a" exitCode=0
Feb 27 00:52:03 crc kubenswrapper[4781]: I0227 00:52:03.182188 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535892-5kr7b" event={"ID":"7eb81f10-0ea8-4376-b588-a3d9462c0bc4","Type":"ContainerDied","Data":"f073e5337bf518b81829559352fcea1859d4a5ced7771a4c11f45807c039ab0a"}
Feb 27 00:52:04 crc kubenswrapper[4781]: I0227 00:52:04.680246 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535892-5kr7b"
Feb 27 00:52:04 crc kubenswrapper[4781]: I0227 00:52:04.766555 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhdm6\" (UniqueName: \"kubernetes.io/projected/7eb81f10-0ea8-4376-b588-a3d9462c0bc4-kube-api-access-dhdm6\") pod \"7eb81f10-0ea8-4376-b588-a3d9462c0bc4\" (UID: \"7eb81f10-0ea8-4376-b588-a3d9462c0bc4\") "
Feb 27 00:52:04 crc kubenswrapper[4781]: I0227 00:52:04.772056 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eb81f10-0ea8-4376-b588-a3d9462c0bc4-kube-api-access-dhdm6" (OuterVolumeSpecName: "kube-api-access-dhdm6") pod "7eb81f10-0ea8-4376-b588-a3d9462c0bc4" (UID: "7eb81f10-0ea8-4376-b588-a3d9462c0bc4"). InnerVolumeSpecName "kube-api-access-dhdm6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:52:04 crc kubenswrapper[4781]: I0227 00:52:04.869475 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhdm6\" (UniqueName: \"kubernetes.io/projected/7eb81f10-0ea8-4376-b588-a3d9462c0bc4-kube-api-access-dhdm6\") on node \"crc\" DevicePath \"\""
Feb 27 00:52:05 crc kubenswrapper[4781]: I0227 00:52:05.201767 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535892-5kr7b" event={"ID":"7eb81f10-0ea8-4376-b588-a3d9462c0bc4","Type":"ContainerDied","Data":"81f5d9ffd1253c0fe66074826a1be97377eb0943041ec8404b90de2bb8cd82b6"}
Feb 27 00:52:05 crc kubenswrapper[4781]: I0227 00:52:05.202093 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81f5d9ffd1253c0fe66074826a1be97377eb0943041ec8404b90de2bb8cd82b6"
Feb 27 00:52:05 crc kubenswrapper[4781]: I0227 00:52:05.201845 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535892-5kr7b"
Feb 27 00:52:05 crc kubenswrapper[4781]: I0227 00:52:05.765322 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535886-5khtq"]
Feb 27 00:52:05 crc kubenswrapper[4781]: I0227 00:52:05.775675 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535886-5khtq"]
Feb 27 00:52:07 crc kubenswrapper[4781]: I0227 00:52:07.324387 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8143ddb0-990c-4f1e-9130-7ca30776e64b" path="/var/lib/kubelet/pods/8143ddb0-990c-4f1e-9130-7ca30776e64b/volumes"
Feb 27 00:52:31 crc kubenswrapper[4781]: I0227 00:52:31.778666 4781 scope.go:117] "RemoveContainer" containerID="e0bb531ca8e9ee4c1a35ccb62422bfe50af2c334314f4bd145d5137b8ad741e6"
Feb 27 00:52:39 crc kubenswrapper[4781]: I0227 00:52:39.931901 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sfnkp"]
Feb 27 00:52:39 crc kubenswrapper[4781]: E0227 00:52:39.933375 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb81f10-0ea8-4376-b588-a3d9462c0bc4" containerName="oc"
Feb 27 00:52:39 crc kubenswrapper[4781]: I0227 00:52:39.933393 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb81f10-0ea8-4376-b588-a3d9462c0bc4" containerName="oc"
Feb 27 00:52:39 crc kubenswrapper[4781]: I0227 00:52:39.933599 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eb81f10-0ea8-4376-b588-a3d9462c0bc4" containerName="oc"
Feb 27 00:52:39 crc kubenswrapper[4781]: I0227 00:52:39.935299 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sfnkp"
Feb 27 00:52:39 crc kubenswrapper[4781]: I0227 00:52:39.943185 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sfnkp"]
Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.061695 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-utilities\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp"
Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.061809 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-catalog-content\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp"
Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.061856 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqwm8\" (UniqueName: \"kubernetes.io/projected/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-kube-api-access-sqwm8\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp"
Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.163966 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-catalog-content\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp"
Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.164037 4781 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-sqwm8\" (UniqueName: \"kubernetes.io/projected/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-kube-api-access-sqwm8\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.164166 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-utilities\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.164674 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-utilities\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.164674 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-catalog-content\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.188886 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqwm8\" (UniqueName: \"kubernetes.io/projected/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-kube-api-access-sqwm8\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.255164 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.816896 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sfnkp"] Feb 27 00:52:41 crc kubenswrapper[4781]: I0227 00:52:41.579116 4781 generic.go:334] "Generic (PLEG): container finished" podID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerID="72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a" exitCode=0 Feb 27 00:52:41 crc kubenswrapper[4781]: I0227 00:52:41.579234 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfnkp" event={"ID":"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037","Type":"ContainerDied","Data":"72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a"} Feb 27 00:52:41 crc kubenswrapper[4781]: I0227 00:52:41.579420 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfnkp" event={"ID":"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037","Type":"ContainerStarted","Data":"3b315d0c2f14459090fc3e1cdd3ea378a65d651e1ca9cba884325f43c392d3da"} Feb 27 00:52:43 crc kubenswrapper[4781]: I0227 00:52:43.597617 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfnkp" event={"ID":"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037","Type":"ContainerStarted","Data":"8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698"} Feb 27 00:52:44 crc kubenswrapper[4781]: I0227 00:52:44.607859 4781 generic.go:334] "Generic (PLEG): container finished" podID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerID="8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698" exitCode=0 Feb 27 00:52:44 crc kubenswrapper[4781]: I0227 00:52:44.608062 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfnkp" 
event={"ID":"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037","Type":"ContainerDied","Data":"8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698"} Feb 27 00:52:45 crc kubenswrapper[4781]: I0227 00:52:45.625491 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfnkp" event={"ID":"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037","Type":"ContainerStarted","Data":"6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994"} Feb 27 00:52:45 crc kubenswrapper[4781]: I0227 00:52:45.654904 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sfnkp" podStartSLOduration=3.255425093 podStartE2EDuration="6.65487939s" podCreationTimestamp="2026-02-27 00:52:39 +0000 UTC" firstStartedPulling="2026-02-27 00:52:41.582681111 +0000 UTC m=+2830.840220675" lastFinishedPulling="2026-02-27 00:52:44.982135418 +0000 UTC m=+2834.239674972" observedRunningTime="2026-02-27 00:52:45.644929978 +0000 UTC m=+2834.902469522" watchObservedRunningTime="2026-02-27 00:52:45.65487939 +0000 UTC m=+2834.912418954" Feb 27 00:52:50 crc kubenswrapper[4781]: I0227 00:52:50.255566 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:50 crc kubenswrapper[4781]: I0227 00:52:50.255917 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:50 crc kubenswrapper[4781]: I0227 00:52:50.309593 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:50 crc kubenswrapper[4781]: I0227 00:52:50.721408 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:50 crc kubenswrapper[4781]: I0227 00:52:50.769753 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-sfnkp"] Feb 27 00:52:52 crc kubenswrapper[4781]: I0227 00:52:52.692508 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sfnkp" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerName="registry-server" containerID="cri-o://6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994" gracePeriod=2 Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.362328 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.454017 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-catalog-content\") pod \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.455933 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqwm8\" (UniqueName: \"kubernetes.io/projected/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-kube-api-access-sqwm8\") pod \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.456050 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-utilities\") pod \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.459084 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-utilities" (OuterVolumeSpecName: "utilities") pod "d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" (UID: 
"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.478501 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-kube-api-access-sqwm8" (OuterVolumeSpecName: "kube-api-access-sqwm8") pod "d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" (UID: "d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037"). InnerVolumeSpecName "kube-api-access-sqwm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.511979 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" (UID: "d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.560110 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.560364 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqwm8\" (UniqueName: \"kubernetes.io/projected/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-kube-api-access-sqwm8\") on node \"crc\" DevicePath \"\"" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.560448 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.704678 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerID="6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994" exitCode=0 Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.704799 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfnkp" event={"ID":"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037","Type":"ContainerDied","Data":"6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994"} Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.705057 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfnkp" event={"ID":"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037","Type":"ContainerDied","Data":"3b315d0c2f14459090fc3e1cdd3ea378a65d651e1ca9cba884325f43c392d3da"} Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.705083 4781 scope.go:117] "RemoveContainer" containerID="6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.704835 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.766340 4781 scope.go:117] "RemoveContainer" containerID="8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.777731 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sfnkp"] Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.789254 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sfnkp"] Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.821561 4781 scope.go:117] "RemoveContainer" containerID="72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.869408 4781 scope.go:117] "RemoveContainer" containerID="6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994" Feb 27 00:52:53 crc kubenswrapper[4781]: E0227 00:52:53.869909 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994\": container with ID starting with 6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994 not found: ID does not exist" containerID="6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.869946 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994"} err="failed to get container status \"6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994\": rpc error: code = NotFound desc = could not find container \"6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994\": container with ID starting with 6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994 not 
found: ID does not exist" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.869971 4781 scope.go:117] "RemoveContainer" containerID="8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698" Feb 27 00:52:53 crc kubenswrapper[4781]: E0227 00:52:53.870335 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698\": container with ID starting with 8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698 not found: ID does not exist" containerID="8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.870366 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698"} err="failed to get container status \"8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698\": rpc error: code = NotFound desc = could not find container \"8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698\": container with ID starting with 8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698 not found: ID does not exist" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.870385 4781 scope.go:117] "RemoveContainer" containerID="72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a" Feb 27 00:52:53 crc kubenswrapper[4781]: E0227 00:52:53.871243 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a\": container with ID starting with 72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a not found: ID does not exist" containerID="72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.871274 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a"} err="failed to get container status \"72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a\": rpc error: code = NotFound desc = could not find container \"72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a\": container with ID starting with 72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a not found: ID does not exist" Feb 27 00:52:55 crc kubenswrapper[4781]: I0227 00:52:55.321351 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" path="/var/lib/kubelet/pods/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037/volumes" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.153548 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535894-wdzjg"] Feb 27 00:54:00 crc kubenswrapper[4781]: E0227 00:54:00.157849 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerName="extract-content" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.157884 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerName="extract-content" Feb 27 00:54:00 crc kubenswrapper[4781]: E0227 00:54:00.157915 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerName="registry-server" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.157924 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerName="registry-server" Feb 27 00:54:00 crc kubenswrapper[4781]: E0227 00:54:00.157958 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerName="extract-utilities" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 
00:54:00.157968 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerName="extract-utilities" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.158265 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerName="registry-server" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.159599 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535894-wdzjg" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.161993 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.162435 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.163606 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.175893 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535894-wdzjg"] Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.299151 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql7dj\" (UniqueName: \"kubernetes.io/projected/1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa-kube-api-access-ql7dj\") pod \"auto-csr-approver-29535894-wdzjg\" (UID: \"1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa\") " pod="openshift-infra/auto-csr-approver-29535894-wdzjg" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.403419 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql7dj\" (UniqueName: \"kubernetes.io/projected/1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa-kube-api-access-ql7dj\") pod \"auto-csr-approver-29535894-wdzjg\" 
(UID: \"1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa\") " pod="openshift-infra/auto-csr-approver-29535894-wdzjg" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.425770 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql7dj\" (UniqueName: \"kubernetes.io/projected/1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa-kube-api-access-ql7dj\") pod \"auto-csr-approver-29535894-wdzjg\" (UID: \"1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa\") " pod="openshift-infra/auto-csr-approver-29535894-wdzjg" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.483845 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535894-wdzjg" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.971573 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535894-wdzjg"] Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.978602 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 00:54:01 crc kubenswrapper[4781]: I0227 00:54:01.426514 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535894-wdzjg" event={"ID":"1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa","Type":"ContainerStarted","Data":"02533eb2fb7ee8c0037fd145c3ab25c86e7b7f823eb3ca9cb44384e6c4541a14"} Feb 27 00:54:03 crc kubenswrapper[4781]: I0227 00:54:03.446832 4781 generic.go:334] "Generic (PLEG): container finished" podID="1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa" containerID="cd5eb21f935374fa744e81c6189b26e6ff6841a0ef882762f86735b2bdaec5ee" exitCode=0 Feb 27 00:54:03 crc kubenswrapper[4781]: I0227 00:54:03.446892 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535894-wdzjg" event={"ID":"1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa","Type":"ContainerDied","Data":"cd5eb21f935374fa744e81c6189b26e6ff6841a0ef882762f86735b2bdaec5ee"} Feb 27 00:54:04 crc kubenswrapper[4781]: I0227 
00:54:04.905619 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535894-wdzjg" Feb 27 00:54:05 crc kubenswrapper[4781]: I0227 00:54:05.012687 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql7dj\" (UniqueName: \"kubernetes.io/projected/1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa-kube-api-access-ql7dj\") pod \"1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa\" (UID: \"1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa\") " Feb 27 00:54:05 crc kubenswrapper[4781]: I0227 00:54:05.019211 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa-kube-api-access-ql7dj" (OuterVolumeSpecName: "kube-api-access-ql7dj") pod "1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa" (UID: "1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa"). InnerVolumeSpecName "kube-api-access-ql7dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:54:05 crc kubenswrapper[4781]: I0227 00:54:05.115150 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql7dj\" (UniqueName: \"kubernetes.io/projected/1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa-kube-api-access-ql7dj\") on node \"crc\" DevicePath \"\"" Feb 27 00:54:05 crc kubenswrapper[4781]: I0227 00:54:05.468436 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535894-wdzjg" event={"ID":"1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa","Type":"ContainerDied","Data":"02533eb2fb7ee8c0037fd145c3ab25c86e7b7f823eb3ca9cb44384e6c4541a14"} Feb 27 00:54:05 crc kubenswrapper[4781]: I0227 00:54:05.468473 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02533eb2fb7ee8c0037fd145c3ab25c86e7b7f823eb3ca9cb44384e6c4541a14" Feb 27 00:54:05 crc kubenswrapper[4781]: I0227 00:54:05.468523 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535894-wdzjg" Feb 27 00:54:05 crc kubenswrapper[4781]: I0227 00:54:05.989730 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535888-nb28f"] Feb 27 00:54:06 crc kubenswrapper[4781]: I0227 00:54:06.001877 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535888-nb28f"] Feb 27 00:54:07 crc kubenswrapper[4781]: I0227 00:54:07.319719 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="231f1edd-305c-4a6c-bd4e-11c12c2ae515" path="/var/lib/kubelet/pods/231f1edd-305c-4a6c-bd4e-11c12c2ae515/volumes" Feb 27 00:54:12 crc kubenswrapper[4781]: I0227 00:54:12.895006 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:54:12 crc kubenswrapper[4781]: I0227 00:54:12.895422 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:54:31 crc kubenswrapper[4781]: I0227 00:54:31.907429 4781 scope.go:117] "RemoveContainer" containerID="2518570ffdceb97ceb198f4ca24bb08d3d0c202488b87c6e1650891fc7084042" Feb 27 00:54:42 crc kubenswrapper[4781]: I0227 00:54:42.895459 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:54:42 crc kubenswrapper[4781]: 
I0227 00:54:42.896010 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 00:55:12 crc kubenswrapper[4781]: I0227 00:55:12.895195 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 00:55:12 crc kubenswrapper[4781]: I0227 00:55:12.895737 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 00:55:12 crc kubenswrapper[4781]: I0227 00:55:12.895778 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj"
Feb 27 00:55:12 crc kubenswrapper[4781]: I0227 00:55:12.896552 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9eb13c9d0480acfdc7ab15c203347b34a63e0504efc9127264d926b2dd0b3a20"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 27 00:55:12 crc kubenswrapper[4781]: I0227 00:55:12.896616 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://9eb13c9d0480acfdc7ab15c203347b34a63e0504efc9127264d926b2dd0b3a20" gracePeriod=600
Feb 27 00:55:13 crc kubenswrapper[4781]: I0227 00:55:13.089470 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="9eb13c9d0480acfdc7ab15c203347b34a63e0504efc9127264d926b2dd0b3a20" exitCode=0
Feb 27 00:55:13 crc kubenswrapper[4781]: I0227 00:55:13.089516 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"9eb13c9d0480acfdc7ab15c203347b34a63e0504efc9127264d926b2dd0b3a20"}
Feb 27 00:55:13 crc kubenswrapper[4781]: I0227 00:55:13.089549 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"
Feb 27 00:55:14 crc kubenswrapper[4781]: I0227 00:55:14.099604 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"}
Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.155762 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535896-46l2w"]
Feb 27 00:56:00 crc kubenswrapper[4781]: E0227 00:56:00.156903 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa" containerName="oc"
Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.156922 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa" containerName="oc"
Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.157132 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa" containerName="oc"
Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.158094 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535896-46l2w"
Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.161641 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.161662 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr"
Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.165047 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.166443 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535896-46l2w"]
Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.318056 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z7f8\" (UniqueName: \"kubernetes.io/projected/4c6b6160-b122-4248-b7ed-a206d3bc633e-kube-api-access-2z7f8\") pod \"auto-csr-approver-29535896-46l2w\" (UID: \"4c6b6160-b122-4248-b7ed-a206d3bc633e\") " pod="openshift-infra/auto-csr-approver-29535896-46l2w"
Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.420614 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z7f8\" (UniqueName: \"kubernetes.io/projected/4c6b6160-b122-4248-b7ed-a206d3bc633e-kube-api-access-2z7f8\") pod \"auto-csr-approver-29535896-46l2w\" (UID: \"4c6b6160-b122-4248-b7ed-a206d3bc633e\") " pod="openshift-infra/auto-csr-approver-29535896-46l2w"
Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.447703 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z7f8\" (UniqueName: \"kubernetes.io/projected/4c6b6160-b122-4248-b7ed-a206d3bc633e-kube-api-access-2z7f8\") pod \"auto-csr-approver-29535896-46l2w\" (UID: \"4c6b6160-b122-4248-b7ed-a206d3bc633e\") " pod="openshift-infra/auto-csr-approver-29535896-46l2w"
Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.478878 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535896-46l2w"
Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.923089 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535896-46l2w"]
Feb 27 00:56:01 crc kubenswrapper[4781]: I0227 00:56:01.132649 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535896-46l2w" event={"ID":"4c6b6160-b122-4248-b7ed-a206d3bc633e","Type":"ContainerStarted","Data":"218f3c0a6c3ea1d377ad17763bf130dd08c0890823380392b00094a89bf1ab51"}
Feb 27 00:56:03 crc kubenswrapper[4781]: I0227 00:56:03.151713 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535896-46l2w" event={"ID":"4c6b6160-b122-4248-b7ed-a206d3bc633e","Type":"ContainerStarted","Data":"e8e722eebfb284cc61eb30213644cd7eb1815f8a77725715668d5116c8a7d0d7"}
Feb 27 00:56:03 crc kubenswrapper[4781]: I0227 00:56:03.169539 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535896-46l2w" podStartSLOduration=1.511518862 podStartE2EDuration="3.169519226s" podCreationTimestamp="2026-02-27 00:56:00 +0000 UTC" firstStartedPulling="2026-02-27 00:56:00.930056336 +0000 UTC m=+3030.187595890" lastFinishedPulling="2026-02-27 00:56:02.5880567 +0000 UTC m=+3031.845596254" observedRunningTime="2026-02-27 00:56:03.164279747 +0000 UTC m=+3032.421819321" watchObservedRunningTime="2026-02-27 00:56:03.169519226 +0000 UTC m=+3032.427058780"
Feb 27 00:56:04 crc kubenswrapper[4781]: I0227 00:56:04.161443 4781 generic.go:334] "Generic (PLEG): container finished" podID="4c6b6160-b122-4248-b7ed-a206d3bc633e" containerID="e8e722eebfb284cc61eb30213644cd7eb1815f8a77725715668d5116c8a7d0d7" exitCode=0
Feb 27 00:56:04 crc kubenswrapper[4781]: I0227 00:56:04.161553 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535896-46l2w" event={"ID":"4c6b6160-b122-4248-b7ed-a206d3bc633e","Type":"ContainerDied","Data":"e8e722eebfb284cc61eb30213644cd7eb1815f8a77725715668d5116c8a7d0d7"}
Feb 27 00:56:05 crc kubenswrapper[4781]: I0227 00:56:05.579260 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535896-46l2w"
Feb 27 00:56:05 crc kubenswrapper[4781]: I0227 00:56:05.738019 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z7f8\" (UniqueName: \"kubernetes.io/projected/4c6b6160-b122-4248-b7ed-a206d3bc633e-kube-api-access-2z7f8\") pod \"4c6b6160-b122-4248-b7ed-a206d3bc633e\" (UID: \"4c6b6160-b122-4248-b7ed-a206d3bc633e\") "
Feb 27 00:56:05 crc kubenswrapper[4781]: I0227 00:56:05.751831 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c6b6160-b122-4248-b7ed-a206d3bc633e-kube-api-access-2z7f8" (OuterVolumeSpecName: "kube-api-access-2z7f8") pod "4c6b6160-b122-4248-b7ed-a206d3bc633e" (UID: "4c6b6160-b122-4248-b7ed-a206d3bc633e"). InnerVolumeSpecName "kube-api-access-2z7f8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:56:05 crc kubenswrapper[4781]: I0227 00:56:05.841115 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z7f8\" (UniqueName: \"kubernetes.io/projected/4c6b6160-b122-4248-b7ed-a206d3bc633e-kube-api-access-2z7f8\") on node \"crc\" DevicePath \"\""
Feb 27 00:56:06 crc kubenswrapper[4781]: I0227 00:56:06.184233 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535896-46l2w" event={"ID":"4c6b6160-b122-4248-b7ed-a206d3bc633e","Type":"ContainerDied","Data":"218f3c0a6c3ea1d377ad17763bf130dd08c0890823380392b00094a89bf1ab51"}
Feb 27 00:56:06 crc kubenswrapper[4781]: I0227 00:56:06.184284 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="218f3c0a6c3ea1d377ad17763bf130dd08c0890823380392b00094a89bf1ab51"
Feb 27 00:56:06 crc kubenswrapper[4781]: I0227 00:56:06.184291 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535896-46l2w"
Feb 27 00:56:06 crc kubenswrapper[4781]: I0227 00:56:06.238755 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535890-nv49g"]
Feb 27 00:56:06 crc kubenswrapper[4781]: I0227 00:56:06.247398 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535890-nv49g"]
Feb 27 00:56:07 crc kubenswrapper[4781]: I0227 00:56:07.320998 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88fd3abb-2996-49d0-851b-41e0040438fa" path="/var/lib/kubelet/pods/88fd3abb-2996-49d0-851b-41e0040438fa/volumes"
Feb 27 00:56:32 crc kubenswrapper[4781]: I0227 00:56:32.008442 4781 scope.go:117] "RemoveContainer" containerID="f2bb4ab5a55c1440d2f1c4f2cba63824f23a7c027f89afef51d965a575920c2b"
Feb 27 00:57:42 crc kubenswrapper[4781]: I0227 00:57:42.900100 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 00:57:42 crc kubenswrapper[4781]: I0227 00:57:42.900752 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.161653 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535898-vpdkx"]
Feb 27 00:58:00 crc kubenswrapper[4781]: E0227 00:58:00.162965 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6b6160-b122-4248-b7ed-a206d3bc633e" containerName="oc"
Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.162983 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6b6160-b122-4248-b7ed-a206d3bc633e" containerName="oc"
Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.163243 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6b6160-b122-4248-b7ed-a206d3bc633e" containerName="oc"
Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.164202 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535898-vpdkx"
Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.167645 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr"
Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.169821 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.170272 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.173425 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535898-vpdkx"]
Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.188156 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4548k\" (UniqueName: \"kubernetes.io/projected/b518ad5e-0994-4767-9c6d-d2ca11998a43-kube-api-access-4548k\") pod \"auto-csr-approver-29535898-vpdkx\" (UID: \"b518ad5e-0994-4767-9c6d-d2ca11998a43\") " pod="openshift-infra/auto-csr-approver-29535898-vpdkx"
Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.290291 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4548k\" (UniqueName: \"kubernetes.io/projected/b518ad5e-0994-4767-9c6d-d2ca11998a43-kube-api-access-4548k\") pod \"auto-csr-approver-29535898-vpdkx\" (UID: \"b518ad5e-0994-4767-9c6d-d2ca11998a43\") " pod="openshift-infra/auto-csr-approver-29535898-vpdkx"
Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.312284 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4548k\" (UniqueName: \"kubernetes.io/projected/b518ad5e-0994-4767-9c6d-d2ca11998a43-kube-api-access-4548k\") pod \"auto-csr-approver-29535898-vpdkx\" (UID: \"b518ad5e-0994-4767-9c6d-d2ca11998a43\") " pod="openshift-infra/auto-csr-approver-29535898-vpdkx"
Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.484362 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535898-vpdkx"
Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.941498 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535898-vpdkx"]
Feb 27 00:58:01 crc kubenswrapper[4781]: I0227 00:58:01.257799 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535898-vpdkx" event={"ID":"b518ad5e-0994-4767-9c6d-d2ca11998a43","Type":"ContainerStarted","Data":"259afc3c6a121e274fc6ab9d86321b3db6f85807994694abde31715461c872fd"}
Feb 27 00:58:02 crc kubenswrapper[4781]: I0227 00:58:02.270978 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535898-vpdkx" event={"ID":"b518ad5e-0994-4767-9c6d-d2ca11998a43","Type":"ContainerStarted","Data":"5f9790a75567a30dcdf46b8e6f6e9baff3953d885f3c6f58834afe7ab39768fd"}
Feb 27 00:58:02 crc kubenswrapper[4781]: I0227 00:58:02.291241 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535898-vpdkx" podStartSLOduration=1.314665779 podStartE2EDuration="2.291217992s" podCreationTimestamp="2026-02-27 00:58:00 +0000 UTC" firstStartedPulling="2026-02-27 00:58:00.950655093 +0000 UTC m=+3150.208194647" lastFinishedPulling="2026-02-27 00:58:01.927207306 +0000 UTC m=+3151.184746860" observedRunningTime="2026-02-27 00:58:02.291202331 +0000 UTC m=+3151.548741885" watchObservedRunningTime="2026-02-27 00:58:02.291217992 +0000 UTC m=+3151.548757546"
Feb 27 00:58:03 crc kubenswrapper[4781]: I0227 00:58:03.295526 4781 generic.go:334] "Generic (PLEG): container finished" podID="b518ad5e-0994-4767-9c6d-d2ca11998a43" containerID="5f9790a75567a30dcdf46b8e6f6e9baff3953d885f3c6f58834afe7ab39768fd" exitCode=0
Feb 27 00:58:03 crc kubenswrapper[4781]: I0227 00:58:03.295581 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535898-vpdkx" event={"ID":"b518ad5e-0994-4767-9c6d-d2ca11998a43","Type":"ContainerDied","Data":"5f9790a75567a30dcdf46b8e6f6e9baff3953d885f3c6f58834afe7ab39768fd"}
Feb 27 00:58:04 crc kubenswrapper[4781]: I0227 00:58:04.742585 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535898-vpdkx"
Feb 27 00:58:04 crc kubenswrapper[4781]: I0227 00:58:04.874534 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4548k\" (UniqueName: \"kubernetes.io/projected/b518ad5e-0994-4767-9c6d-d2ca11998a43-kube-api-access-4548k\") pod \"b518ad5e-0994-4767-9c6d-d2ca11998a43\" (UID: \"b518ad5e-0994-4767-9c6d-d2ca11998a43\") "
Feb 27 00:58:04 crc kubenswrapper[4781]: I0227 00:58:04.881956 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b518ad5e-0994-4767-9c6d-d2ca11998a43-kube-api-access-4548k" (OuterVolumeSpecName: "kube-api-access-4548k") pod "b518ad5e-0994-4767-9c6d-d2ca11998a43" (UID: "b518ad5e-0994-4767-9c6d-d2ca11998a43"). InnerVolumeSpecName "kube-api-access-4548k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:58:04 crc kubenswrapper[4781]: I0227 00:58:04.977459 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4548k\" (UniqueName: \"kubernetes.io/projected/b518ad5e-0994-4767-9c6d-d2ca11998a43-kube-api-access-4548k\") on node \"crc\" DevicePath \"\""
Feb 27 00:58:05 crc kubenswrapper[4781]: I0227 00:58:05.316226 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535898-vpdkx"
Feb 27 00:58:05 crc kubenswrapper[4781]: I0227 00:58:05.322075 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535898-vpdkx" event={"ID":"b518ad5e-0994-4767-9c6d-d2ca11998a43","Type":"ContainerDied","Data":"259afc3c6a121e274fc6ab9d86321b3db6f85807994694abde31715461c872fd"}
Feb 27 00:58:05 crc kubenswrapper[4781]: I0227 00:58:05.322122 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="259afc3c6a121e274fc6ab9d86321b3db6f85807994694abde31715461c872fd"
Feb 27 00:58:05 crc kubenswrapper[4781]: I0227 00:58:05.820268 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535892-5kr7b"]
Feb 27 00:58:05 crc kubenswrapper[4781]: I0227 00:58:05.831091 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535892-5kr7b"]
Feb 27 00:58:07 crc kubenswrapper[4781]: I0227 00:58:07.322924 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eb81f10-0ea8-4376-b588-a3d9462c0bc4" path="/var/lib/kubelet/pods/7eb81f10-0ea8-4376-b588-a3d9462c0bc4/volumes"
Feb 27 00:58:12 crc kubenswrapper[4781]: I0227 00:58:12.895155 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 00:58:12 crc kubenswrapper[4781]: I0227 00:58:12.895645 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 00:58:32 crc kubenswrapper[4781]: I0227 00:58:32.109205 4781 scope.go:117] "RemoveContainer" containerID="f073e5337bf518b81829559352fcea1859d4a5ced7771a4c11f45807c039ab0a"
Feb 27 00:58:42 crc kubenswrapper[4781]: I0227 00:58:42.895537 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 00:58:42 crc kubenswrapper[4781]: I0227 00:58:42.896215 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 00:58:42 crc kubenswrapper[4781]: I0227 00:58:42.896273 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj"
Feb 27 00:58:42 crc kubenswrapper[4781]: I0227 00:58:42.897143 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 27 00:58:42 crc kubenswrapper[4781]: I0227 00:58:42.897197 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" gracePeriod=600
Feb 27 00:58:43 crc kubenswrapper[4781]: E0227 00:58:43.231419 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32c19e2e_0830_47a5_9ea8_862e1c9d8571.slice/crio-conmon-af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3.scope\": RecentStats: unable to find data in memory cache]"
Feb 27 00:58:43 crc kubenswrapper[4781]: E0227 00:58:43.522507 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:58:43 crc kubenswrapper[4781]: I0227 00:58:43.687014 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" exitCode=0
Feb 27 00:58:43 crc kubenswrapper[4781]: I0227 00:58:43.687062 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"}
Feb 27 00:58:43 crc kubenswrapper[4781]: I0227 00:58:43.687098 4781 scope.go:117] "RemoveContainer" containerID="9eb13c9d0480acfdc7ab15c203347b34a63e0504efc9127264d926b2dd0b3a20"
Feb 27 00:58:43 crc kubenswrapper[4781]: I0227 00:58:43.687804 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"
Feb 27 00:58:43 crc kubenswrapper[4781]: E0227 00:58:43.688117 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:58:56 crc kubenswrapper[4781]: I0227 00:58:56.309955 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"
Feb 27 00:58:56 crc kubenswrapper[4781]: E0227 00:58:56.310868 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:59:08 crc kubenswrapper[4781]: I0227 00:59:08.309468 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"
Feb 27 00:59:08 crc kubenswrapper[4781]: E0227 00:59:08.310180 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.388832 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f2nl8"]
Feb 27 00:59:11 crc kubenswrapper[4781]: E0227 00:59:11.391091 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b518ad5e-0994-4767-9c6d-d2ca11998a43" containerName="oc"
Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.391128 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b518ad5e-0994-4767-9c6d-d2ca11998a43" containerName="oc"
Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.391361 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b518ad5e-0994-4767-9c6d-d2ca11998a43" containerName="oc"
Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.393311 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2nl8"
Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.403578 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2nl8"]
Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.573292 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxlwl\" (UniqueName: \"kubernetes.io/projected/debd5f2a-600a-4378-9ec7-133418b38ffe-kube-api-access-fxlwl\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8"
Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.573411 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-utilities\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8"
Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.573445 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-catalog-content\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8"
Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.675110 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxlwl\" (UniqueName: \"kubernetes.io/projected/debd5f2a-600a-4378-9ec7-133418b38ffe-kube-api-access-fxlwl\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8"
Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.675213 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-utilities\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8"
Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.675238 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-catalog-content\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8"
Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.676079 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-catalog-content\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8"
Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.676260 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-utilities\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8"
Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.702614 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxlwl\" (UniqueName: \"kubernetes.io/projected/debd5f2a-600a-4378-9ec7-133418b38ffe-kube-api-access-fxlwl\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8"
Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.716345 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2nl8"
Feb 27 00:59:12 crc kubenswrapper[4781]: I0227 00:59:12.173151 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2nl8"]
Feb 27 00:59:12 crc kubenswrapper[4781]: I0227 00:59:12.958020 4781 generic.go:334] "Generic (PLEG): container finished" podID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerID="ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4" exitCode=0
Feb 27 00:59:12 crc kubenswrapper[4781]: I0227 00:59:12.958255 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2nl8" event={"ID":"debd5f2a-600a-4378-9ec7-133418b38ffe","Type":"ContainerDied","Data":"ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4"}
Feb 27 00:59:12 crc kubenswrapper[4781]: I0227 00:59:12.958282 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2nl8" event={"ID":"debd5f2a-600a-4378-9ec7-133418b38ffe","Type":"ContainerStarted","Data":"e8b6bf74d32142b7644d81c9992f7b8fe97a8df0b6f36a4ecbde8a8ead769a8e"}
Feb 27 00:59:12 crc kubenswrapper[4781]: I0227 00:59:12.960453 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 27 00:59:13 crc kubenswrapper[4781]: I0227 00:59:13.968172 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2nl8" event={"ID":"debd5f2a-600a-4378-9ec7-133418b38ffe","Type":"ContainerStarted","Data":"d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55"}
Feb 27 00:59:14 crc kubenswrapper[4781]: I0227 00:59:14.979734 4781 generic.go:334] "Generic (PLEG): container finished" podID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerID="d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55" exitCode=0
Feb 27 00:59:14 crc kubenswrapper[4781]: I0227 00:59:14.979815 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2nl8" event={"ID":"debd5f2a-600a-4378-9ec7-133418b38ffe","Type":"ContainerDied","Data":"d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55"}
Feb 27 00:59:15 crc kubenswrapper[4781]: I0227 00:59:15.998845 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2nl8" event={"ID":"debd5f2a-600a-4378-9ec7-133418b38ffe","Type":"ContainerStarted","Data":"89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485"}
Feb 27 00:59:16 crc kubenswrapper[4781]: I0227 00:59:16.023459 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f2nl8" podStartSLOduration=2.522354554 podStartE2EDuration="5.023441483s" podCreationTimestamp="2026-02-27 00:59:11 +0000 UTC" firstStartedPulling="2026-02-27 00:59:12.960145797 +0000 UTC m=+3222.217685351" lastFinishedPulling="2026-02-27 00:59:15.461232726 +0000 UTC m=+3224.718772280" observedRunningTime="2026-02-27 00:59:16.016443308 +0000 UTC m=+3225.273982872" watchObservedRunningTime="2026-02-27 00:59:16.023441483 +0000 UTC m=+3225.280981037"
Feb 27 00:59:21 crc kubenswrapper[4781]: I0227 00:59:21.321310 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"
Feb 27 00:59:21 crc kubenswrapper[4781]: E0227 00:59:21.322117 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 00:59:21 crc kubenswrapper[4781]: I0227 00:59:21.716649 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f2nl8"
Feb 27 00:59:21 crc kubenswrapper[4781]: I0227 00:59:21.717082 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f2nl8"
Feb 27 00:59:21 crc kubenswrapper[4781]: I0227 00:59:21.773377 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f2nl8"
Feb 27 00:59:22 crc kubenswrapper[4781]: I0227 00:59:22.109309 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f2nl8"
Feb 27 00:59:22 crc kubenswrapper[4781]: I0227 00:59:22.156496 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2nl8"]
Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.072848 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f2nl8" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerName="registry-server" containerID="cri-o://89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485" gracePeriod=2
Feb 27 00:59:24 crc kubenswrapper[4781]: E0227 00:59:24.241602 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddebd5f2a_600a_4378_9ec7_133418b38ffe.slice/crio-conmon-89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddebd5f2a_600a_4378_9ec7_133418b38ffe.slice/crio-89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485.scope\": RecentStats: unable to find data in memory cache]"
Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.766243 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2nl8"
Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.860124 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxlwl\" (UniqueName: \"kubernetes.io/projected/debd5f2a-600a-4378-9ec7-133418b38ffe-kube-api-access-fxlwl\") pod \"debd5f2a-600a-4378-9ec7-133418b38ffe\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") "
Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.860278 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-utilities\") pod \"debd5f2a-600a-4378-9ec7-133418b38ffe\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") "
Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.860362 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-catalog-content\") pod \"debd5f2a-600a-4378-9ec7-133418b38ffe\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") "
Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.861211 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-utilities" (OuterVolumeSpecName: "utilities")
pod "debd5f2a-600a-4378-9ec7-133418b38ffe" (UID: "debd5f2a-600a-4378-9ec7-133418b38ffe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.861694 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.876107 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/debd5f2a-600a-4378-9ec7-133418b38ffe-kube-api-access-fxlwl" (OuterVolumeSpecName: "kube-api-access-fxlwl") pod "debd5f2a-600a-4378-9ec7-133418b38ffe" (UID: "debd5f2a-600a-4378-9ec7-133418b38ffe"). InnerVolumeSpecName "kube-api-access-fxlwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.899675 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "debd5f2a-600a-4378-9ec7-133418b38ffe" (UID: "debd5f2a-600a-4378-9ec7-133418b38ffe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.963843 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxlwl\" (UniqueName: \"kubernetes.io/projected/debd5f2a-600a-4378-9ec7-133418b38ffe-kube-api-access-fxlwl\") on node \"crc\" DevicePath \"\"" Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.964220 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.083299 4781 generic.go:334] "Generic (PLEG): container finished" podID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerID="89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485" exitCode=0 Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.083339 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2nl8" event={"ID":"debd5f2a-600a-4378-9ec7-133418b38ffe","Type":"ContainerDied","Data":"89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485"} Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.083352 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.083381 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2nl8" event={"ID":"debd5f2a-600a-4378-9ec7-133418b38ffe","Type":"ContainerDied","Data":"e8b6bf74d32142b7644d81c9992f7b8fe97a8df0b6f36a4ecbde8a8ead769a8e"} Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.083399 4781 scope.go:117] "RemoveContainer" containerID="89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.110054 4781 scope.go:117] "RemoveContainer" containerID="d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.123745 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2nl8"] Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.136433 4781 scope.go:117] "RemoveContainer" containerID="ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.136665 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2nl8"] Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.192421 4781 scope.go:117] "RemoveContainer" containerID="89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485" Feb 27 00:59:25 crc kubenswrapper[4781]: E0227 00:59:25.192907 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485\": container with ID starting with 89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485 not found: ID does not exist" containerID="89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.192976 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485"} err="failed to get container status \"89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485\": rpc error: code = NotFound desc = could not find container \"89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485\": container with ID starting with 89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485 not found: ID does not exist" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.193004 4781 scope.go:117] "RemoveContainer" containerID="d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55" Feb 27 00:59:25 crc kubenswrapper[4781]: E0227 00:59:25.194383 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55\": container with ID starting with d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55 not found: ID does not exist" containerID="d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.194441 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55"} err="failed to get container status \"d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55\": rpc error: code = NotFound desc = could not find container \"d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55\": container with ID starting with d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55 not found: ID does not exist" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.194477 4781 scope.go:117] "RemoveContainer" containerID="ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4" Feb 27 00:59:25 crc kubenswrapper[4781]: E0227 
00:59:25.195048 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4\": container with ID starting with ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4 not found: ID does not exist" containerID="ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.195734 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4"} err="failed to get container status \"ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4\": rpc error: code = NotFound desc = could not find container \"ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4\": container with ID starting with ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4 not found: ID does not exist" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.321368 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" path="/var/lib/kubelet/pods/debd5f2a-600a-4378-9ec7-133418b38ffe/volumes" Feb 27 00:59:35 crc kubenswrapper[4781]: I0227 00:59:35.311307 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 00:59:35 crc kubenswrapper[4781]: E0227 00:59:35.312071 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:59:50 crc kubenswrapper[4781]: I0227 00:59:50.309297 
4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 00:59:50 crc kubenswrapper[4781]: E0227 00:59:50.310080 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.182036 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535900-6n6zr"] Feb 27 01:00:00 crc kubenswrapper[4781]: E0227 01:00:00.183155 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerName="registry-server" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.183172 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerName="registry-server" Feb 27 01:00:00 crc kubenswrapper[4781]: E0227 01:00:00.183193 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerName="extract-content" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.183203 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerName="extract-content" Feb 27 01:00:00 crc kubenswrapper[4781]: E0227 01:00:00.183226 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerName="extract-utilities" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.183235 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerName="extract-utilities" Feb 27 01:00:00 crc 
kubenswrapper[4781]: I0227 01:00:00.183535 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerName="registry-server" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.184421 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535900-6n6zr" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.187729 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.188260 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.189935 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.195413 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc"] Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.197307 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.200776 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.203154 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.210755 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535900-6n6zr"] Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.219417 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc"] Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.297918 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4be24050-0334-412d-9c01-525815caef28-secret-volume\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.298028 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4be24050-0334-412d-9c01-525815caef28-config-volume\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.298138 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f99cs\" (UniqueName: 
\"kubernetes.io/projected/4be24050-0334-412d-9c01-525815caef28-kube-api-access-f99cs\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.298277 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srjss\" (UniqueName: \"kubernetes.io/projected/0aac78d6-5f5c-4b48-95f2-554353abcdd3-kube-api-access-srjss\") pod \"auto-csr-approver-29535900-6n6zr\" (UID: \"0aac78d6-5f5c-4b48-95f2-554353abcdd3\") " pod="openshift-infra/auto-csr-approver-29535900-6n6zr" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.399692 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srjss\" (UniqueName: \"kubernetes.io/projected/0aac78d6-5f5c-4b48-95f2-554353abcdd3-kube-api-access-srjss\") pod \"auto-csr-approver-29535900-6n6zr\" (UID: \"0aac78d6-5f5c-4b48-95f2-554353abcdd3\") " pod="openshift-infra/auto-csr-approver-29535900-6n6zr" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.399790 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4be24050-0334-412d-9c01-525815caef28-secret-volume\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.399827 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4be24050-0334-412d-9c01-525815caef28-config-volume\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 
01:00:00.399902 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f99cs\" (UniqueName: \"kubernetes.io/projected/4be24050-0334-412d-9c01-525815caef28-kube-api-access-f99cs\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.400913 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4be24050-0334-412d-9c01-525815caef28-config-volume\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.406676 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4be24050-0334-412d-9c01-525815caef28-secret-volume\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.419812 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srjss\" (UniqueName: \"kubernetes.io/projected/0aac78d6-5f5c-4b48-95f2-554353abcdd3-kube-api-access-srjss\") pod \"auto-csr-approver-29535900-6n6zr\" (UID: \"0aac78d6-5f5c-4b48-95f2-554353abcdd3\") " pod="openshift-infra/auto-csr-approver-29535900-6n6zr" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.420493 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f99cs\" (UniqueName: \"kubernetes.io/projected/4be24050-0334-412d-9c01-525815caef28-kube-api-access-f99cs\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.511308 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535900-6n6zr" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.526563 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:01 crc kubenswrapper[4781]: I0227 01:00:01.000735 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc"] Feb 27 01:00:01 crc kubenswrapper[4781]: I0227 01:00:01.105433 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535900-6n6zr"] Feb 27 01:00:01 crc kubenswrapper[4781]: W0227 01:00:01.124915 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aac78d6_5f5c_4b48_95f2_554353abcdd3.slice/crio-f7d1c70db713d929fdac86351a40f8e0c653d199ec79319ad594457a51783afe WatchSource:0}: Error finding container f7d1c70db713d929fdac86351a40f8e0c653d199ec79319ad594457a51783afe: Status 404 returned error can't find the container with id f7d1c70db713d929fdac86351a40f8e0c653d199ec79319ad594457a51783afe Feb 27 01:00:01 crc kubenswrapper[4781]: I0227 01:00:01.425803 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535900-6n6zr" event={"ID":"0aac78d6-5f5c-4b48-95f2-554353abcdd3","Type":"ContainerStarted","Data":"f7d1c70db713d929fdac86351a40f8e0c653d199ec79319ad594457a51783afe"} Feb 27 01:00:01 crc kubenswrapper[4781]: I0227 01:00:01.427380 4781 generic.go:334] "Generic (PLEG): container finished" podID="4be24050-0334-412d-9c01-525815caef28" containerID="88af5d2f7a10fab35719beb6fe30d40bbd5167a8ef25825d9123b0d5e4f7e563" exitCode=0 Feb 27 01:00:01 crc 
kubenswrapper[4781]: I0227 01:00:01.427442 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" event={"ID":"4be24050-0334-412d-9c01-525815caef28","Type":"ContainerDied","Data":"88af5d2f7a10fab35719beb6fe30d40bbd5167a8ef25825d9123b0d5e4f7e563"} Feb 27 01:00:01 crc kubenswrapper[4781]: I0227 01:00:01.427473 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" event={"ID":"4be24050-0334-412d-9c01-525815caef28","Type":"ContainerStarted","Data":"9aada356d2480234caecb610cda7a6bb9e6ec4d19386ef97e88fd09cb849c15f"} Feb 27 01:00:02 crc kubenswrapper[4781]: I0227 01:00:02.310425 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:00:02 crc kubenswrapper[4781]: E0227 01:00:02.311313 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:00:02 crc kubenswrapper[4781]: I0227 01:00:02.875861 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:02 crc kubenswrapper[4781]: I0227 01:00:02.961751 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4be24050-0334-412d-9c01-525815caef28-config-volume\") pod \"4be24050-0334-412d-9c01-525815caef28\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " Feb 27 01:00:02 crc kubenswrapper[4781]: I0227 01:00:02.961929 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4be24050-0334-412d-9c01-525815caef28-secret-volume\") pod \"4be24050-0334-412d-9c01-525815caef28\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " Feb 27 01:00:02 crc kubenswrapper[4781]: I0227 01:00:02.961969 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f99cs\" (UniqueName: \"kubernetes.io/projected/4be24050-0334-412d-9c01-525815caef28-kube-api-access-f99cs\") pod \"4be24050-0334-412d-9c01-525815caef28\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " Feb 27 01:00:02 crc kubenswrapper[4781]: I0227 01:00:02.963647 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4be24050-0334-412d-9c01-525815caef28-config-volume" (OuterVolumeSpecName: "config-volume") pod "4be24050-0334-412d-9c01-525815caef28" (UID: "4be24050-0334-412d-9c01-525815caef28"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:00:02 crc kubenswrapper[4781]: I0227 01:00:02.969523 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4be24050-0334-412d-9c01-525815caef28-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4be24050-0334-412d-9c01-525815caef28" (UID: "4be24050-0334-412d-9c01-525815caef28"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:00:02 crc kubenswrapper[4781]: I0227 01:00:02.974407 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4be24050-0334-412d-9c01-525815caef28-kube-api-access-f99cs" (OuterVolumeSpecName: "kube-api-access-f99cs") pod "4be24050-0334-412d-9c01-525815caef28" (UID: "4be24050-0334-412d-9c01-525815caef28"). InnerVolumeSpecName "kube-api-access-f99cs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:00:03 crc kubenswrapper[4781]: I0227 01:00:03.064266 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4be24050-0334-412d-9c01-525815caef28-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:00:03 crc kubenswrapper[4781]: I0227 01:00:03.064364 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f99cs\" (UniqueName: \"kubernetes.io/projected/4be24050-0334-412d-9c01-525815caef28-kube-api-access-f99cs\") on node \"crc\" DevicePath \"\"" Feb 27 01:00:03 crc kubenswrapper[4781]: I0227 01:00:03.064374 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4be24050-0334-412d-9c01-525815caef28-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:00:03 crc kubenswrapper[4781]: I0227 01:00:03.451169 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" event={"ID":"4be24050-0334-412d-9c01-525815caef28","Type":"ContainerDied","Data":"9aada356d2480234caecb610cda7a6bb9e6ec4d19386ef97e88fd09cb849c15f"} Feb 27 01:00:03 crc kubenswrapper[4781]: I0227 01:00:03.451218 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9aada356d2480234caecb610cda7a6bb9e6ec4d19386ef97e88fd09cb849c15f" Feb 27 01:00:03 crc kubenswrapper[4781]: I0227 01:00:03.451278 4781 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:03 crc kubenswrapper[4781]: I0227 01:00:03.945936 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"] Feb 27 01:00:03 crc kubenswrapper[4781]: I0227 01:00:03.955202 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"] Feb 27 01:00:05 crc kubenswrapper[4781]: I0227 01:00:05.357101 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22b44418-6039-4859-96ba-1442e52b290e" path="/var/lib/kubelet/pods/22b44418-6039-4859-96ba-1442e52b290e/volumes" Feb 27 01:00:05 crc kubenswrapper[4781]: I0227 01:00:05.473421 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535900-6n6zr" event={"ID":"0aac78d6-5f5c-4b48-95f2-554353abcdd3","Type":"ContainerStarted","Data":"fb76bcf8730e0171831c959b0a00779c7b469f5264f4c1f6152625c6f8db5a04"} Feb 27 01:00:05 crc kubenswrapper[4781]: I0227 01:00:05.497506 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535900-6n6zr" podStartSLOduration=1.8308608990000002 podStartE2EDuration="5.497483978s" podCreationTimestamp="2026-02-27 01:00:00 +0000 UTC" firstStartedPulling="2026-02-27 01:00:01.12980955 +0000 UTC m=+3270.387349114" lastFinishedPulling="2026-02-27 01:00:04.796432639 +0000 UTC m=+3274.053972193" observedRunningTime="2026-02-27 01:00:05.486710333 +0000 UTC m=+3274.744249907" watchObservedRunningTime="2026-02-27 01:00:05.497483978 +0000 UTC m=+3274.755023542" Feb 27 01:00:06 crc kubenswrapper[4781]: I0227 01:00:06.484237 4781 generic.go:334] "Generic (PLEG): container finished" podID="0aac78d6-5f5c-4b48-95f2-554353abcdd3" containerID="fb76bcf8730e0171831c959b0a00779c7b469f5264f4c1f6152625c6f8db5a04" exitCode=0 Feb 27 01:00:06 crc 
kubenswrapper[4781]: I0227 01:00:06.484296 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535900-6n6zr" event={"ID":"0aac78d6-5f5c-4b48-95f2-554353abcdd3","Type":"ContainerDied","Data":"fb76bcf8730e0171831c959b0a00779c7b469f5264f4c1f6152625c6f8db5a04"}
Feb 27 01:00:07 crc kubenswrapper[4781]: I0227 01:00:07.929230 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535900-6n6zr"
Feb 27 01:00:07 crc kubenswrapper[4781]: I0227 01:00:07.964048 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srjss\" (UniqueName: \"kubernetes.io/projected/0aac78d6-5f5c-4b48-95f2-554353abcdd3-kube-api-access-srjss\") pod \"0aac78d6-5f5c-4b48-95f2-554353abcdd3\" (UID: \"0aac78d6-5f5c-4b48-95f2-554353abcdd3\") "
Feb 27 01:00:07 crc kubenswrapper[4781]: I0227 01:00:07.970108 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aac78d6-5f5c-4b48-95f2-554353abcdd3-kube-api-access-srjss" (OuterVolumeSpecName: "kube-api-access-srjss") pod "0aac78d6-5f5c-4b48-95f2-554353abcdd3" (UID: "0aac78d6-5f5c-4b48-95f2-554353abcdd3"). InnerVolumeSpecName "kube-api-access-srjss". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:00:08 crc kubenswrapper[4781]: I0227 01:00:08.066974 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srjss\" (UniqueName: \"kubernetes.io/projected/0aac78d6-5f5c-4b48-95f2-554353abcdd3-kube-api-access-srjss\") on node \"crc\" DevicePath \"\""
Feb 27 01:00:08 crc kubenswrapper[4781]: I0227 01:00:08.505903 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535900-6n6zr" event={"ID":"0aac78d6-5f5c-4b48-95f2-554353abcdd3","Type":"ContainerDied","Data":"f7d1c70db713d929fdac86351a40f8e0c653d199ec79319ad594457a51783afe"}
Feb 27 01:00:08 crc kubenswrapper[4781]: I0227 01:00:08.506248 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7d1c70db713d929fdac86351a40f8e0c653d199ec79319ad594457a51783afe"
Feb 27 01:00:08 crc kubenswrapper[4781]: I0227 01:00:08.505944 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535900-6n6zr"
Feb 27 01:00:08 crc kubenswrapper[4781]: I0227 01:00:08.557799 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535894-wdzjg"]
Feb 27 01:00:08 crc kubenswrapper[4781]: I0227 01:00:08.571336 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535894-wdzjg"]
Feb 27 01:00:09 crc kubenswrapper[4781]: I0227 01:00:09.321371 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa" path="/var/lib/kubelet/pods/1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa/volumes"
Feb 27 01:00:17 crc kubenswrapper[4781]: I0227 01:00:17.310100 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"
Feb 27 01:00:17 crc kubenswrapper[4781]: E0227 01:00:17.310741 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 01:00:30 crc kubenswrapper[4781]: I0227 01:00:30.310538 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"
Feb 27 01:00:30 crc kubenswrapper[4781]: E0227 01:00:30.311392 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 01:00:32 crc kubenswrapper[4781]: I0227 01:00:32.233545 4781 scope.go:117] "RemoveContainer" containerID="1f1ef56dac2e7ed3023bb30987d569aec06c9a96b99c1e9e939085397f33ecaf"
Feb 27 01:00:32 crc kubenswrapper[4781]: I0227 01:00:32.274224 4781 scope.go:117] "RemoveContainer" containerID="cd5eb21f935374fa744e81c6189b26e6ff6841a0ef882762f86735b2bdaec5ee"
Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.868458 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cwjz6"]
Feb 27 01:00:40 crc kubenswrapper[4781]: E0227 01:00:40.870422 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aac78d6-5f5c-4b48-95f2-554353abcdd3" containerName="oc"
Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.870443 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aac78d6-5f5c-4b48-95f2-554353abcdd3" containerName="oc"
Feb 27 01:00:40 crc kubenswrapper[4781]: E0227 01:00:40.870469 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4be24050-0334-412d-9c01-525815caef28" containerName="collect-profiles"
Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.870478 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be24050-0334-412d-9c01-525815caef28" containerName="collect-profiles"
Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.882429 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4be24050-0334-412d-9c01-525815caef28" containerName="collect-profiles"
Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.882478 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aac78d6-5f5c-4b48-95f2-554353abcdd3" containerName="oc"
Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.886903 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.891242 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cwjz6"]
Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.946001 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gkj9\" (UniqueName: \"kubernetes.io/projected/8645ad19-e972-4563-8c61-0b409e68654f-kube-api-access-2gkj9\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.946156 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-catalog-content\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.946182 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-utilities\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.048009 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-catalog-content\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.048376 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-utilities\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.048572 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gkj9\" (UniqueName: \"kubernetes.io/projected/8645ad19-e972-4563-8c61-0b409e68654f-kube-api-access-2gkj9\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.048980 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-catalog-content\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.048980 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-utilities\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.077653 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gkj9\" (UniqueName: \"kubernetes.io/projected/8645ad19-e972-4563-8c61-0b409e68654f-kube-api-access-2gkj9\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.221836 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.757978 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cwjz6"]
Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.825352 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwjz6" event={"ID":"8645ad19-e972-4563-8c61-0b409e68654f","Type":"ContainerStarted","Data":"ce0753b848865fc62648932e9ede65f00c3e009f79642c333fed3ad246907b56"}
Feb 27 01:00:42 crc kubenswrapper[4781]: I0227 01:00:42.835798 4781 generic.go:334] "Generic (PLEG): container finished" podID="8645ad19-e972-4563-8c61-0b409e68654f" containerID="f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4" exitCode=0
Feb 27 01:00:42 crc kubenswrapper[4781]: I0227 01:00:42.835838 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwjz6" event={"ID":"8645ad19-e972-4563-8c61-0b409e68654f","Type":"ContainerDied","Data":"f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4"}
Feb 27 01:00:44 crc kubenswrapper[4781]: I0227 01:00:44.309172 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"
Feb 27 01:00:44 crc kubenswrapper[4781]: E0227 01:00:44.309892 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 01:00:59 crc kubenswrapper[4781]: I0227 01:00:59.309776 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"
Feb 27 01:00:59 crc kubenswrapper[4781]: E0227 01:00:59.310581 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.152699 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29535901-2chr7"]
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.154116 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535901-2chr7"
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.170107 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29535901-2chr7"]
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.260702 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lts4\" (UniqueName: \"kubernetes.io/projected/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-kube-api-access-2lts4\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7"
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.260993 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-combined-ca-bundle\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7"
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.261093 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-config-data\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7"
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.261157 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-fernet-keys\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7"
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.363750 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lts4\" (UniqueName: \"kubernetes.io/projected/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-kube-api-access-2lts4\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7"
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.363924 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-combined-ca-bundle\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7"
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.363958 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-config-data\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7"
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.364014 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-fernet-keys\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7"
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.370148 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-config-data\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7"
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.371233 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-fernet-keys\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7"
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.371257 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-combined-ca-bundle\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7"
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.381089 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lts4\" (UniqueName: \"kubernetes.io/projected/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-kube-api-access-2lts4\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7"
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.478166 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535901-2chr7"
Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.981524 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29535901-2chr7"]
Feb 27 01:01:01 crc kubenswrapper[4781]: I0227 01:01:01.019915 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwjz6" event={"ID":"8645ad19-e972-4563-8c61-0b409e68654f","Type":"ContainerStarted","Data":"8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb"}
Feb 27 01:01:02 crc kubenswrapper[4781]: I0227 01:01:02.030751 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535901-2chr7" event={"ID":"8f6a0640-2204-47a2-a550-7a7bb14ebc0d","Type":"ContainerStarted","Data":"7557da63e2de03059baf804218735736137f6a0ae74a8bf68e61c8dde24476f0"}
Feb 27 01:01:02 crc kubenswrapper[4781]: I0227 01:01:02.031121 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535901-2chr7" event={"ID":"8f6a0640-2204-47a2-a550-7a7bb14ebc0d","Type":"ContainerStarted","Data":"537cda34ad3149982b0c8785cb238f7f78c9e98fdecdc0beb26e59d3dd7545ce"}
Feb 27 01:01:02 crc kubenswrapper[4781]: I0227 01:01:02.049237 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29535901-2chr7" podStartSLOduration=2.049216542 podStartE2EDuration="2.049216542s" podCreationTimestamp="2026-02-27 01:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:01:02.04500643 +0000 UTC m=+3331.302545994" watchObservedRunningTime="2026-02-27 01:01:02.049216542 +0000 UTC m=+3331.306756096"
Feb 27 01:01:06 crc kubenswrapper[4781]: I0227 01:01:06.067248 4781 generic.go:334] "Generic (PLEG): container finished" podID="8645ad19-e972-4563-8c61-0b409e68654f" containerID="8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb" exitCode=0
Feb 27 01:01:06 crc kubenswrapper[4781]: I0227 01:01:06.067319 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwjz6" event={"ID":"8645ad19-e972-4563-8c61-0b409e68654f","Type":"ContainerDied","Data":"8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb"}
Feb 27 01:01:07 crc kubenswrapper[4781]: I0227 01:01:07.081338 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwjz6" event={"ID":"8645ad19-e972-4563-8c61-0b409e68654f","Type":"ContainerStarted","Data":"fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1"}
Feb 27 01:01:07 crc kubenswrapper[4781]: I0227 01:01:07.083282 4781 generic.go:334] "Generic (PLEG): container finished" podID="8f6a0640-2204-47a2-a550-7a7bb14ebc0d" containerID="7557da63e2de03059baf804218735736137f6a0ae74a8bf68e61c8dde24476f0" exitCode=0
Feb 27 01:01:07 crc kubenswrapper[4781]: I0227 01:01:07.083329 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535901-2chr7" event={"ID":"8f6a0640-2204-47a2-a550-7a7bb14ebc0d","Type":"ContainerDied","Data":"7557da63e2de03059baf804218735736137f6a0ae74a8bf68e61c8dde24476f0"}
Feb 27 01:01:07 crc kubenswrapper[4781]: I0227 01:01:07.109343 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cwjz6" podStartSLOduration=3.192889401 podStartE2EDuration="27.10932431s" podCreationTimestamp="2026-02-27 01:00:40 +0000 UTC" firstStartedPulling="2026-02-27 01:00:42.837876343 +0000 UTC m=+3312.095415887" lastFinishedPulling="2026-02-27 01:01:06.754311242 +0000 UTC m=+3336.011850796" observedRunningTime="2026-02-27 01:01:07.102994903 +0000 UTC m=+3336.360534457" watchObservedRunningTime="2026-02-27 01:01:07.10932431 +0000 UTC m=+3336.366863864"
Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.617611 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535901-2chr7"
Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.650295 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lts4\" (UniqueName: \"kubernetes.io/projected/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-kube-api-access-2lts4\") pod \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") "
Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.650376 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-fernet-keys\") pod \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") "
Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.650447 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-combined-ca-bundle\") pod \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") "
Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.650643 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-config-data\") pod \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") "
Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.658721 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8f6a0640-2204-47a2-a550-7a7bb14ebc0d" (UID: "8f6a0640-2204-47a2-a550-7a7bb14ebc0d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.667860 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-kube-api-access-2lts4" (OuterVolumeSpecName: "kube-api-access-2lts4") pod "8f6a0640-2204-47a2-a550-7a7bb14ebc0d" (UID: "8f6a0640-2204-47a2-a550-7a7bb14ebc0d"). InnerVolumeSpecName "kube-api-access-2lts4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.697054 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f6a0640-2204-47a2-a550-7a7bb14ebc0d" (UID: "8f6a0640-2204-47a2-a550-7a7bb14ebc0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.724071 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-config-data" (OuterVolumeSpecName: "config-data") pod "8f6a0640-2204-47a2-a550-7a7bb14ebc0d" (UID: "8f6a0640-2204-47a2-a550-7a7bb14ebc0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.752971 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.753022 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lts4\" (UniqueName: \"kubernetes.io/projected/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-kube-api-access-2lts4\") on node \"crc\" DevicePath \"\""
Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.753048 4781 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.753059 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:01:09 crc kubenswrapper[4781]: I0227 01:01:09.104757 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535901-2chr7" event={"ID":"8f6a0640-2204-47a2-a550-7a7bb14ebc0d","Type":"ContainerDied","Data":"537cda34ad3149982b0c8785cb238f7f78c9e98fdecdc0beb26e59d3dd7545ce"}
Feb 27 01:01:09 crc kubenswrapper[4781]: I0227 01:01:09.104805 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="537cda34ad3149982b0c8785cb238f7f78c9e98fdecdc0beb26e59d3dd7545ce"
Feb 27 01:01:09 crc kubenswrapper[4781]: I0227 01:01:09.104841 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535901-2chr7"
Feb 27 01:01:11 crc kubenswrapper[4781]: I0227 01:01:11.222087 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:01:11 crc kubenswrapper[4781]: I0227 01:01:11.222701 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:01:12 crc kubenswrapper[4781]: I0227 01:01:12.271651 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cwjz6" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="registry-server" probeResult="failure" output=<
Feb 27 01:01:12 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s
Feb 27 01:01:12 crc kubenswrapper[4781]: >
Feb 27 01:01:14 crc kubenswrapper[4781]: I0227 01:01:14.310167 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"
Feb 27 01:01:14 crc kubenswrapper[4781]: E0227 01:01:14.310738 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 01:01:22 crc kubenswrapper[4781]: I0227 01:01:22.269433 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cwjz6" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="registry-server" probeResult="failure" output=<
Feb 27 01:01:22 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s
Feb 27 01:01:22 crc kubenswrapper[4781]: >
Feb 27 01:01:25 crc kubenswrapper[4781]: I0227 01:01:25.309404 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"
Feb 27 01:01:25 crc kubenswrapper[4781]: E0227 01:01:25.310347 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 01:01:32 crc kubenswrapper[4781]: I0227 01:01:32.269559 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cwjz6" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="registry-server" probeResult="failure" output=<
Feb 27 01:01:32 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s
Feb 27 01:01:32 crc kubenswrapper[4781]: >
Feb 27 01:01:39 crc kubenswrapper[4781]: I0227 01:01:39.309550 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"
Feb 27 01:01:39 crc kubenswrapper[4781]: E0227 01:01:39.310462 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 01:01:41 crc kubenswrapper[4781]: I0227 01:01:41.273199 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:01:41 crc kubenswrapper[4781]: I0227 01:01:41.324404 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:01:42 crc kubenswrapper[4781]: I0227 01:01:42.085873 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cwjz6"]
Feb 27 01:01:42 crc kubenswrapper[4781]: I0227 01:01:42.459552 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cwjz6" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="registry-server" containerID="cri-o://fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1" gracePeriod=2
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.026295 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.150285 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gkj9\" (UniqueName: \"kubernetes.io/projected/8645ad19-e972-4563-8c61-0b409e68654f-kube-api-access-2gkj9\") pod \"8645ad19-e972-4563-8c61-0b409e68654f\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") "
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.150658 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-catalog-content\") pod \"8645ad19-e972-4563-8c61-0b409e68654f\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") "
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.150709 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-utilities\") pod \"8645ad19-e972-4563-8c61-0b409e68654f\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") "
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.151525 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-utilities" (OuterVolumeSpecName: "utilities") pod "8645ad19-e972-4563-8c61-0b409e68654f" (UID: "8645ad19-e972-4563-8c61-0b409e68654f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.152301 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.164538 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8645ad19-e972-4563-8c61-0b409e68654f-kube-api-access-2gkj9" (OuterVolumeSpecName: "kube-api-access-2gkj9") pod "8645ad19-e972-4563-8c61-0b409e68654f" (UID: "8645ad19-e972-4563-8c61-0b409e68654f"). InnerVolumeSpecName "kube-api-access-2gkj9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.254990 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gkj9\" (UniqueName: \"kubernetes.io/projected/8645ad19-e972-4563-8c61-0b409e68654f-kube-api-access-2gkj9\") on node \"crc\" DevicePath \"\""
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.278607 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8645ad19-e972-4563-8c61-0b409e68654f" (UID: "8645ad19-e972-4563-8c61-0b409e68654f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.358913 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.474921 4781 generic.go:334] "Generic (PLEG): container finished" podID="8645ad19-e972-4563-8c61-0b409e68654f" containerID="fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1" exitCode=0
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.474970 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwjz6" event={"ID":"8645ad19-e972-4563-8c61-0b409e68654f","Type":"ContainerDied","Data":"fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1"}
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.475001 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwjz6" event={"ID":"8645ad19-e972-4563-8c61-0b409e68654f","Type":"ContainerDied","Data":"ce0753b848865fc62648932e9ede65f00c3e009f79642c333fed3ad246907b56"}
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.475017 4781 scope.go:117] "RemoveContainer" containerID="fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1"
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.475155 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cwjz6"
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.495875 4781 scope.go:117] "RemoveContainer" containerID="8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb"
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.501089 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cwjz6"]
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.511088 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cwjz6"]
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.528425 4781 scope.go:117] "RemoveContainer" containerID="f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4"
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.619902 4781 scope.go:117] "RemoveContainer" containerID="fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1"
Feb 27 01:01:43 crc kubenswrapper[4781]: E0227 01:01:43.620995 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1\": container with ID starting with fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1 not found: ID does not exist" containerID="fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1"
Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.621048 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1"} err="failed to get container status \"fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1\": rpc error: code = NotFound desc = could not find container \"fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1\": container with ID starting with fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1 not found: ID does 
not exist" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.621080 4781 scope.go:117] "RemoveContainer" containerID="8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb" Feb 27 01:01:43 crc kubenswrapper[4781]: E0227 01:01:43.621929 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb\": container with ID starting with 8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb not found: ID does not exist" containerID="8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.621970 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb"} err="failed to get container status \"8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb\": rpc error: code = NotFound desc = could not find container \"8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb\": container with ID starting with 8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb not found: ID does not exist" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.621997 4781 scope.go:117] "RemoveContainer" containerID="f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4" Feb 27 01:01:43 crc kubenswrapper[4781]: E0227 01:01:43.623162 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4\": container with ID starting with f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4 not found: ID does not exist" containerID="f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.623205 4781 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4"} err="failed to get container status \"f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4\": rpc error: code = NotFound desc = could not find container \"f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4\": container with ID starting with f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4 not found: ID does not exist" Feb 27 01:01:45 crc kubenswrapper[4781]: I0227 01:01:45.326607 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8645ad19-e972-4563-8c61-0b409e68654f" path="/var/lib/kubelet/pods/8645ad19-e972-4563-8c61-0b409e68654f/volumes" Feb 27 01:01:54 crc kubenswrapper[4781]: I0227 01:01:54.309919 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:01:54 crc kubenswrapper[4781]: E0227 01:01:54.312045 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.251669 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jdd4z"] Feb 27 01:01:58 crc kubenswrapper[4781]: E0227 01:01:58.252764 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="extract-content" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.252782 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="extract-content" Feb 27 
01:01:58 crc kubenswrapper[4781]: E0227 01:01:58.252802 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f6a0640-2204-47a2-a550-7a7bb14ebc0d" containerName="keystone-cron" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.252811 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f6a0640-2204-47a2-a550-7a7bb14ebc0d" containerName="keystone-cron" Feb 27 01:01:58 crc kubenswrapper[4781]: E0227 01:01:58.252832 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="registry-server" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.252840 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="registry-server" Feb 27 01:01:58 crc kubenswrapper[4781]: E0227 01:01:58.252872 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="extract-utilities" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.252880 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="extract-utilities" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.253072 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="registry-server" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.253084 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f6a0640-2204-47a2-a550-7a7bb14ebc0d" containerName="keystone-cron" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.254527 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.273135 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jdd4z"] Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.383932 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-utilities\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.384323 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bsth\" (UniqueName: \"kubernetes.io/projected/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-kube-api-access-2bsth\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.384449 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-catalog-content\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.487959 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-catalog-content\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.488322 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-utilities\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.488443 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bsth\" (UniqueName: \"kubernetes.io/projected/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-kube-api-access-2bsth\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.488931 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-utilities\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.489160 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-catalog-content\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.512365 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bsth\" (UniqueName: \"kubernetes.io/projected/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-kube-api-access-2bsth\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.582032 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:59 crc kubenswrapper[4781]: I0227 01:01:59.140756 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jdd4z"] Feb 27 01:01:59 crc kubenswrapper[4781]: I0227 01:01:59.638325 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerID="026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808" exitCode=0 Feb 27 01:01:59 crc kubenswrapper[4781]: I0227 01:01:59.638368 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdd4z" event={"ID":"3f253a7c-3c8a-48a7-a91f-676dd51d64bf","Type":"ContainerDied","Data":"026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808"} Feb 27 01:01:59 crc kubenswrapper[4781]: I0227 01:01:59.638394 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdd4z" event={"ID":"3f253a7c-3c8a-48a7-a91f-676dd51d64bf","Type":"ContainerStarted","Data":"df273d841b3449944d268e56a0e28f11d82f21d7f09bba2a5b92b1c4ad1eb152"} Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.179923 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535902-bkgwj"] Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.181414 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535902-bkgwj" Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.185961 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.186817 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.199087 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535902-bkgwj"] Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.199282 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.328045 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-682f8\" (UniqueName: \"kubernetes.io/projected/da201e7f-72da-4998-8ecb-98a8814f423d-kube-api-access-682f8\") pod \"auto-csr-approver-29535902-bkgwj\" (UID: \"da201e7f-72da-4998-8ecb-98a8814f423d\") " pod="openshift-infra/auto-csr-approver-29535902-bkgwj" Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.431392 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-682f8\" (UniqueName: \"kubernetes.io/projected/da201e7f-72da-4998-8ecb-98a8814f423d-kube-api-access-682f8\") pod \"auto-csr-approver-29535902-bkgwj\" (UID: \"da201e7f-72da-4998-8ecb-98a8814f423d\") " pod="openshift-infra/auto-csr-approver-29535902-bkgwj" Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.450700 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-682f8\" (UniqueName: \"kubernetes.io/projected/da201e7f-72da-4998-8ecb-98a8814f423d-kube-api-access-682f8\") pod \"auto-csr-approver-29535902-bkgwj\" (UID: \"da201e7f-72da-4998-8ecb-98a8814f423d\") " 
pod="openshift-infra/auto-csr-approver-29535902-bkgwj" Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.499680 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535902-bkgwj" Feb 27 01:02:01 crc kubenswrapper[4781]: I0227 01:02:01.010691 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535902-bkgwj"] Feb 27 01:02:01 crc kubenswrapper[4781]: I0227 01:02:01.662536 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535902-bkgwj" event={"ID":"da201e7f-72da-4998-8ecb-98a8814f423d","Type":"ContainerStarted","Data":"ba1dabaf3614d6322961d2b5cde702b8395ef307cf5a984baf680198a1f73dc4"} Feb 27 01:02:01 crc kubenswrapper[4781]: I0227 01:02:01.665863 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdd4z" event={"ID":"3f253a7c-3c8a-48a7-a91f-676dd51d64bf","Type":"ContainerStarted","Data":"a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a"} Feb 27 01:02:04 crc kubenswrapper[4781]: I0227 01:02:04.695242 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerID="a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a" exitCode=0 Feb 27 01:02:04 crc kubenswrapper[4781]: I0227 01:02:04.695327 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdd4z" event={"ID":"3f253a7c-3c8a-48a7-a91f-676dd51d64bf","Type":"ContainerDied","Data":"a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a"} Feb 27 01:02:04 crc kubenswrapper[4781]: I0227 01:02:04.699101 4781 generic.go:334] "Generic (PLEG): container finished" podID="da201e7f-72da-4998-8ecb-98a8814f423d" containerID="cba817e11e179b47fa5e55d89f7bb6242121790488edf6a29e663a57c82230bd" exitCode=0 Feb 27 01:02:04 crc kubenswrapper[4781]: I0227 01:02:04.699146 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535902-bkgwj" event={"ID":"da201e7f-72da-4998-8ecb-98a8814f423d","Type":"ContainerDied","Data":"cba817e11e179b47fa5e55d89f7bb6242121790488edf6a29e663a57c82230bd"} Feb 27 01:02:05 crc kubenswrapper[4781]: I0227 01:02:05.310401 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:02:05 crc kubenswrapper[4781]: E0227 01:02:05.310670 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.168509 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535902-bkgwj" Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.252889 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-682f8\" (UniqueName: \"kubernetes.io/projected/da201e7f-72da-4998-8ecb-98a8814f423d-kube-api-access-682f8\") pod \"da201e7f-72da-4998-8ecb-98a8814f423d\" (UID: \"da201e7f-72da-4998-8ecb-98a8814f423d\") " Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.267987 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da201e7f-72da-4998-8ecb-98a8814f423d-kube-api-access-682f8" (OuterVolumeSpecName: "kube-api-access-682f8") pod "da201e7f-72da-4998-8ecb-98a8814f423d" (UID: "da201e7f-72da-4998-8ecb-98a8814f423d"). InnerVolumeSpecName "kube-api-access-682f8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.355844 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-682f8\" (UniqueName: \"kubernetes.io/projected/da201e7f-72da-4998-8ecb-98a8814f423d-kube-api-access-682f8\") on node \"crc\" DevicePath \"\"" Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.723491 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535902-bkgwj" event={"ID":"da201e7f-72da-4998-8ecb-98a8814f423d","Type":"ContainerDied","Data":"ba1dabaf3614d6322961d2b5cde702b8395ef307cf5a984baf680198a1f73dc4"} Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.723563 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba1dabaf3614d6322961d2b5cde702b8395ef307cf5a984baf680198a1f73dc4" Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.723522 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535902-bkgwj" Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.726621 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdd4z" event={"ID":"3f253a7c-3c8a-48a7-a91f-676dd51d64bf","Type":"ContainerStarted","Data":"0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56"} Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.747337 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jdd4z" podStartSLOduration=2.658468756 podStartE2EDuration="8.747302957s" podCreationTimestamp="2026-02-27 01:01:58 +0000 UTC" firstStartedPulling="2026-02-27 01:01:59.640948349 +0000 UTC m=+3388.898487893" lastFinishedPulling="2026-02-27 01:02:05.72978254 +0000 UTC m=+3394.987322094" observedRunningTime="2026-02-27 01:02:06.746359422 +0000 UTC m=+3396.003898976" watchObservedRunningTime="2026-02-27 
01:02:06.747302957 +0000 UTC m=+3396.004842511" Feb 27 01:02:07 crc kubenswrapper[4781]: I0227 01:02:07.243475 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535896-46l2w"] Feb 27 01:02:07 crc kubenswrapper[4781]: I0227 01:02:07.256509 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535896-46l2w"] Feb 27 01:02:07 crc kubenswrapper[4781]: I0227 01:02:07.320524 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c6b6160-b122-4248-b7ed-a206d3bc633e" path="/var/lib/kubelet/pods/4c6b6160-b122-4248-b7ed-a206d3bc633e/volumes" Feb 27 01:02:08 crc kubenswrapper[4781]: I0227 01:02:08.584060 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:02:08 crc kubenswrapper[4781]: I0227 01:02:08.584124 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:02:08 crc kubenswrapper[4781]: I0227 01:02:08.638253 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:02:17 crc kubenswrapper[4781]: I0227 01:02:17.309249 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:02:17 crc kubenswrapper[4781]: E0227 01:02:17.309914 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:02:18 crc kubenswrapper[4781]: I0227 01:02:18.646984 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:02:18 crc kubenswrapper[4781]: I0227 01:02:18.705338 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jdd4z"] Feb 27 01:02:18 crc kubenswrapper[4781]: I0227 01:02:18.849689 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jdd4z" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerName="registry-server" containerID="cri-o://0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56" gracePeriod=2 Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.503965 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.663176 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-catalog-content\") pod \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.663294 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bsth\" (UniqueName: \"kubernetes.io/projected/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-kube-api-access-2bsth\") pod \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.663457 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-utilities\") pod \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.664425 4781 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-utilities" (OuterVolumeSpecName: "utilities") pod "3f253a7c-3c8a-48a7-a91f-676dd51d64bf" (UID: "3f253a7c-3c8a-48a7-a91f-676dd51d64bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.669152 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-kube-api-access-2bsth" (OuterVolumeSpecName: "kube-api-access-2bsth") pod "3f253a7c-3c8a-48a7-a91f-676dd51d64bf" (UID: "3f253a7c-3c8a-48a7-a91f-676dd51d64bf"). InnerVolumeSpecName "kube-api-access-2bsth". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.708148 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f253a7c-3c8a-48a7-a91f-676dd51d64bf" (UID: "3f253a7c-3c8a-48a7-a91f-676dd51d64bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.766246 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bsth\" (UniqueName: \"kubernetes.io/projected/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-kube-api-access-2bsth\") on node \"crc\" DevicePath \"\"" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.766278 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.766288 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.860903 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerID="0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56" exitCode=0 Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.860950 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdd4z" event={"ID":"3f253a7c-3c8a-48a7-a91f-676dd51d64bf","Type":"ContainerDied","Data":"0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56"} Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.860980 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdd4z" event={"ID":"3f253a7c-3c8a-48a7-a91f-676dd51d64bf","Type":"ContainerDied","Data":"df273d841b3449944d268e56a0e28f11d82f21d7f09bba2a5b92b1c4ad1eb152"} Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.860957 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.860999 4781 scope.go:117] "RemoveContainer" containerID="0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.889544 4781 scope.go:117] "RemoveContainer" containerID="a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.901534 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jdd4z"] Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.911596 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jdd4z"] Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.916748 4781 scope.go:117] "RemoveContainer" containerID="026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.960064 4781 scope.go:117] "RemoveContainer" containerID="0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56" Feb 27 01:02:19 crc kubenswrapper[4781]: E0227 01:02:19.960611 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56\": container with ID starting with 0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56 not found: ID does not exist" containerID="0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.960677 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56"} err="failed to get container status \"0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56\": rpc error: code = NotFound desc = could not find 
container \"0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56\": container with ID starting with 0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56 not found: ID does not exist" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.960715 4781 scope.go:117] "RemoveContainer" containerID="a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a" Feb 27 01:02:19 crc kubenswrapper[4781]: E0227 01:02:19.961132 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a\": container with ID starting with a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a not found: ID does not exist" containerID="a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.961222 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a"} err="failed to get container status \"a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a\": rpc error: code = NotFound desc = could not find container \"a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a\": container with ID starting with a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a not found: ID does not exist" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.961292 4781 scope.go:117] "RemoveContainer" containerID="026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808" Feb 27 01:02:19 crc kubenswrapper[4781]: E0227 01:02:19.961587 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808\": container with ID starting with 026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808 not found: ID does 
not exist" containerID="026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.961613 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808"} err="failed to get container status \"026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808\": rpc error: code = NotFound desc = could not find container \"026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808\": container with ID starting with 026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808 not found: ID does not exist" Feb 27 01:02:21 crc kubenswrapper[4781]: I0227 01:02:21.323159 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" path="/var/lib/kubelet/pods/3f253a7c-3c8a-48a7-a91f-676dd51d64bf/volumes" Feb 27 01:02:29 crc kubenswrapper[4781]: I0227 01:02:29.311802 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:02:29 crc kubenswrapper[4781]: E0227 01:02:29.314701 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:02:32 crc kubenswrapper[4781]: I0227 01:02:32.414395 4781 scope.go:117] "RemoveContainer" containerID="e8e722eebfb284cc61eb30213644cd7eb1815f8a77725715668d5116c8a7d0d7" Feb 27 01:02:42 crc kubenswrapper[4781]: I0227 01:02:42.310784 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:02:42 crc 
kubenswrapper[4781]: E0227 01:02:42.311381 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:02:55 crc kubenswrapper[4781]: I0227 01:02:55.310313 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:02:55 crc kubenswrapper[4781]: E0227 01:02:55.311856 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.983131 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kdj4m"] Feb 27 01:03:06 crc kubenswrapper[4781]: E0227 01:03:06.984222 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da201e7f-72da-4998-8ecb-98a8814f423d" containerName="oc" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.984238 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="da201e7f-72da-4998-8ecb-98a8814f423d" containerName="oc" Feb 27 01:03:06 crc kubenswrapper[4781]: E0227 01:03:06.984269 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerName="extract-content" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.984278 4781 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerName="extract-content" Feb 27 01:03:06 crc kubenswrapper[4781]: E0227 01:03:06.984294 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerName="registry-server" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.984301 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerName="registry-server" Feb 27 01:03:06 crc kubenswrapper[4781]: E0227 01:03:06.984332 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerName="extract-utilities" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.984339 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerName="extract-utilities" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.984560 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerName="registry-server" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.984583 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="da201e7f-72da-4998-8ecb-98a8814f423d" containerName="oc" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.986598 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.999484 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kdj4m"] Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.100419 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-utilities\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.100574 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-catalog-content\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.100656 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8q49\" (UniqueName: \"kubernetes.io/projected/265ff10b-4377-40a2-af31-17901aa730b7-kube-api-access-t8q49\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.203901 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-utilities\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.204043 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-catalog-content\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.204091 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8q49\" (UniqueName: \"kubernetes.io/projected/265ff10b-4377-40a2-af31-17901aa730b7-kube-api-access-t8q49\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.204603 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-utilities\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.204709 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-catalog-content\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.226913 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8q49\" (UniqueName: \"kubernetes.io/projected/265ff10b-4377-40a2-af31-17901aa730b7-kube-api-access-t8q49\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.310277 4781 scope.go:117] "RemoveContainer" 
containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:03:07 crc kubenswrapper[4781]: E0227 01:03:07.310598 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.318688 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.915301 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kdj4m"] Feb 27 01:03:08 crc kubenswrapper[4781]: I0227 01:03:08.325974 4781 generic.go:334] "Generic (PLEG): container finished" podID="265ff10b-4377-40a2-af31-17901aa730b7" containerID="786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae" exitCode=0 Feb 27 01:03:08 crc kubenswrapper[4781]: I0227 01:03:08.326027 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdj4m" event={"ID":"265ff10b-4377-40a2-af31-17901aa730b7","Type":"ContainerDied","Data":"786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae"} Feb 27 01:03:08 crc kubenswrapper[4781]: I0227 01:03:08.326064 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdj4m" event={"ID":"265ff10b-4377-40a2-af31-17901aa730b7","Type":"ContainerStarted","Data":"2198efaa880f708ce961cdfd28d8b51bfa029ee0f97064cd5f306fafb07ec0f2"} Feb 27 01:03:09 crc kubenswrapper[4781]: I0227 01:03:09.338743 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-kdj4m" event={"ID":"265ff10b-4377-40a2-af31-17901aa730b7","Type":"ContainerStarted","Data":"bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd"} Feb 27 01:03:11 crc kubenswrapper[4781]: I0227 01:03:11.363480 4781 generic.go:334] "Generic (PLEG): container finished" podID="265ff10b-4377-40a2-af31-17901aa730b7" containerID="bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd" exitCode=0 Feb 27 01:03:11 crc kubenswrapper[4781]: I0227 01:03:11.363575 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdj4m" event={"ID":"265ff10b-4377-40a2-af31-17901aa730b7","Type":"ContainerDied","Data":"bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd"} Feb 27 01:03:13 crc kubenswrapper[4781]: I0227 01:03:13.390488 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdj4m" event={"ID":"265ff10b-4377-40a2-af31-17901aa730b7","Type":"ContainerStarted","Data":"1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e"} Feb 27 01:03:13 crc kubenswrapper[4781]: I0227 01:03:13.416468 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kdj4m" podStartSLOduration=3.559889456 podStartE2EDuration="7.416370924s" podCreationTimestamp="2026-02-27 01:03:06 +0000 UTC" firstStartedPulling="2026-02-27 01:03:08.328574633 +0000 UTC m=+3457.586114187" lastFinishedPulling="2026-02-27 01:03:12.185056101 +0000 UTC m=+3461.442595655" observedRunningTime="2026-02-27 01:03:13.407830118 +0000 UTC m=+3462.665369702" watchObservedRunningTime="2026-02-27 01:03:13.416370924 +0000 UTC m=+3462.673910488" Feb 27 01:03:17 crc kubenswrapper[4781]: I0227 01:03:17.325693 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:17 crc kubenswrapper[4781]: I0227 01:03:17.326571 4781 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:17 crc kubenswrapper[4781]: I0227 01:03:17.375989 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:17 crc kubenswrapper[4781]: I0227 01:03:17.481686 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:17 crc kubenswrapper[4781]: I0227 01:03:17.617961 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kdj4m"] Feb 27 01:03:19 crc kubenswrapper[4781]: I0227 01:03:19.446936 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kdj4m" podUID="265ff10b-4377-40a2-af31-17901aa730b7" containerName="registry-server" containerID="cri-o://1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e" gracePeriod=2 Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.026997 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.097703 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-catalog-content\") pod \"265ff10b-4377-40a2-af31-17901aa730b7\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.097777 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8q49\" (UniqueName: \"kubernetes.io/projected/265ff10b-4377-40a2-af31-17901aa730b7-kube-api-access-t8q49\") pod \"265ff10b-4377-40a2-af31-17901aa730b7\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.098044 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-utilities\") pod \"265ff10b-4377-40a2-af31-17901aa730b7\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.099352 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-utilities" (OuterVolumeSpecName: "utilities") pod "265ff10b-4377-40a2-af31-17901aa730b7" (UID: "265ff10b-4377-40a2-af31-17901aa730b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.115038 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/265ff10b-4377-40a2-af31-17901aa730b7-kube-api-access-t8q49" (OuterVolumeSpecName: "kube-api-access-t8q49") pod "265ff10b-4377-40a2-af31-17901aa730b7" (UID: "265ff10b-4377-40a2-af31-17901aa730b7"). InnerVolumeSpecName "kube-api-access-t8q49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.205443 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.205528 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8q49\" (UniqueName: \"kubernetes.io/projected/265ff10b-4377-40a2-af31-17901aa730b7-kube-api-access-t8q49\") on node \"crc\" DevicePath \"\"" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.283748 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "265ff10b-4377-40a2-af31-17901aa730b7" (UID: "265ff10b-4377-40a2-af31-17901aa730b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.307868 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.459989 4781 generic.go:334] "Generic (PLEG): container finished" podID="265ff10b-4377-40a2-af31-17901aa730b7" containerID="1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e" exitCode=0 Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.460063 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdj4m" event={"ID":"265ff10b-4377-40a2-af31-17901aa730b7","Type":"ContainerDied","Data":"1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e"} Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.460121 4781 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-kdj4m" event={"ID":"265ff10b-4377-40a2-af31-17901aa730b7","Type":"ContainerDied","Data":"2198efaa880f708ce961cdfd28d8b51bfa029ee0f97064cd5f306fafb07ec0f2"} Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.460173 4781 scope.go:117] "RemoveContainer" containerID="1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.460198 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.482862 4781 scope.go:117] "RemoveContainer" containerID="bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.513072 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kdj4m"] Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.515359 4781 scope.go:117] "RemoveContainer" containerID="786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.526990 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kdj4m"] Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.567138 4781 scope.go:117] "RemoveContainer" containerID="1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e" Feb 27 01:03:20 crc kubenswrapper[4781]: E0227 01:03:20.567811 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e\": container with ID starting with 1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e not found: ID does not exist" containerID="1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 
01:03:20.567904 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e"} err="failed to get container status \"1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e\": rpc error: code = NotFound desc = could not find container \"1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e\": container with ID starting with 1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e not found: ID does not exist" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.567941 4781 scope.go:117] "RemoveContainer" containerID="bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd" Feb 27 01:03:20 crc kubenswrapper[4781]: E0227 01:03:20.568869 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd\": container with ID starting with bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd not found: ID does not exist" containerID="bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.568915 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd"} err="failed to get container status \"bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd\": rpc error: code = NotFound desc = could not find container \"bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd\": container with ID starting with bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd not found: ID does not exist" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.568947 4781 scope.go:117] "RemoveContainer" containerID="786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae" Feb 27 01:03:20 crc 
kubenswrapper[4781]: E0227 01:03:20.569297 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae\": container with ID starting with 786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae not found: ID does not exist" containerID="786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.569372 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae"} err="failed to get container status \"786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae\": rpc error: code = NotFound desc = could not find container \"786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae\": container with ID starting with 786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae not found: ID does not exist" Feb 27 01:03:21 crc kubenswrapper[4781]: I0227 01:03:21.321964 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="265ff10b-4377-40a2-af31-17901aa730b7" path="/var/lib/kubelet/pods/265ff10b-4377-40a2-af31-17901aa730b7/volumes" Feb 27 01:03:22 crc kubenswrapper[4781]: I0227 01:03:22.309783 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:03:22 crc kubenswrapper[4781]: E0227 01:03:22.310309 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:03:37 crc 
kubenswrapper[4781]: I0227 01:03:37.310328 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:03:37 crc kubenswrapper[4781]: E0227 01:03:37.311285 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:03:48 crc kubenswrapper[4781]: I0227 01:03:48.309753 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:03:48 crc kubenswrapper[4781]: I0227 01:03:48.729970 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"19e186a1d63090ffda2bb27999feb897d50891041c0f8dac4c6ddf6ef96ddf91"} Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.152816 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535904-lpvnq"] Feb 27 01:04:00 crc kubenswrapper[4781]: E0227 01:04:00.153779 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265ff10b-4377-40a2-af31-17901aa730b7" containerName="extract-content" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.153798 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="265ff10b-4377-40a2-af31-17901aa730b7" containerName="extract-content" Feb 27 01:04:00 crc kubenswrapper[4781]: E0227 01:04:00.153819 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265ff10b-4377-40a2-af31-17901aa730b7" containerName="registry-server" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 
01:04:00.153827 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="265ff10b-4377-40a2-af31-17901aa730b7" containerName="registry-server" Feb 27 01:04:00 crc kubenswrapper[4781]: E0227 01:04:00.153867 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265ff10b-4377-40a2-af31-17901aa730b7" containerName="extract-utilities" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.153876 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="265ff10b-4377-40a2-af31-17901aa730b7" containerName="extract-utilities" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.154176 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="265ff10b-4377-40a2-af31-17901aa730b7" containerName="registry-server" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.155353 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535904-lpvnq" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.157547 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.158198 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.158967 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.165750 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535904-lpvnq"] Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.265797 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq4rg\" (UniqueName: \"kubernetes.io/projected/f8015524-a32f-427b-a5a9-08f1d2257259-kube-api-access-pq4rg\") pod \"auto-csr-approver-29535904-lpvnq\" (UID: 
\"f8015524-a32f-427b-a5a9-08f1d2257259\") " pod="openshift-infra/auto-csr-approver-29535904-lpvnq" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.368317 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq4rg\" (UniqueName: \"kubernetes.io/projected/f8015524-a32f-427b-a5a9-08f1d2257259-kube-api-access-pq4rg\") pod \"auto-csr-approver-29535904-lpvnq\" (UID: \"f8015524-a32f-427b-a5a9-08f1d2257259\") " pod="openshift-infra/auto-csr-approver-29535904-lpvnq" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.391975 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq4rg\" (UniqueName: \"kubernetes.io/projected/f8015524-a32f-427b-a5a9-08f1d2257259-kube-api-access-pq4rg\") pod \"auto-csr-approver-29535904-lpvnq\" (UID: \"f8015524-a32f-427b-a5a9-08f1d2257259\") " pod="openshift-infra/auto-csr-approver-29535904-lpvnq" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.480206 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535904-lpvnq"
Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.986532 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535904-lpvnq"]
Feb 27 01:04:01 crc kubenswrapper[4781]: I0227 01:04:01.860612 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535904-lpvnq" event={"ID":"f8015524-a32f-427b-a5a9-08f1d2257259","Type":"ContainerStarted","Data":"bbfa9c982760cf326a62e9c25967f50c4731c438113a0c7dd6b72a5030919178"}
Feb 27 01:04:02 crc kubenswrapper[4781]: I0227 01:04:02.871667 4781 generic.go:334] "Generic (PLEG): container finished" podID="f8015524-a32f-427b-a5a9-08f1d2257259" containerID="7727ecd4b6ab2c57f71f74adfa530ee79124f2b2f80dab2ef9d287684b1949a8" exitCode=0
Feb 27 01:04:02 crc kubenswrapper[4781]: I0227 01:04:02.871746 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535904-lpvnq" event={"ID":"f8015524-a32f-427b-a5a9-08f1d2257259","Type":"ContainerDied","Data":"7727ecd4b6ab2c57f71f74adfa530ee79124f2b2f80dab2ef9d287684b1949a8"}
Feb 27 01:04:04 crc kubenswrapper[4781]: I0227 01:04:04.306667 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535904-lpvnq"
Feb 27 01:04:04 crc kubenswrapper[4781]: I0227 01:04:04.349477 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq4rg\" (UniqueName: \"kubernetes.io/projected/f8015524-a32f-427b-a5a9-08f1d2257259-kube-api-access-pq4rg\") pod \"f8015524-a32f-427b-a5a9-08f1d2257259\" (UID: \"f8015524-a32f-427b-a5a9-08f1d2257259\") "
Feb 27 01:04:04 crc kubenswrapper[4781]: I0227 01:04:04.361826 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8015524-a32f-427b-a5a9-08f1d2257259-kube-api-access-pq4rg" (OuterVolumeSpecName: "kube-api-access-pq4rg") pod "f8015524-a32f-427b-a5a9-08f1d2257259" (UID: "f8015524-a32f-427b-a5a9-08f1d2257259"). InnerVolumeSpecName "kube-api-access-pq4rg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:04:04 crc kubenswrapper[4781]: I0227 01:04:04.452268 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq4rg\" (UniqueName: \"kubernetes.io/projected/f8015524-a32f-427b-a5a9-08f1d2257259-kube-api-access-pq4rg\") on node \"crc\" DevicePath \"\""
Feb 27 01:04:04 crc kubenswrapper[4781]: I0227 01:04:04.892031 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535904-lpvnq" event={"ID":"f8015524-a32f-427b-a5a9-08f1d2257259","Type":"ContainerDied","Data":"bbfa9c982760cf326a62e9c25967f50c4731c438113a0c7dd6b72a5030919178"}
Feb 27 01:04:04 crc kubenswrapper[4781]: I0227 01:04:04.892085 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbfa9c982760cf326a62e9c25967f50c4731c438113a0c7dd6b72a5030919178"
Feb 27 01:04:04 crc kubenswrapper[4781]: I0227 01:04:04.892150 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535904-lpvnq"
Feb 27 01:04:05 crc kubenswrapper[4781]: I0227 01:04:05.376746 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535898-vpdkx"]
Feb 27 01:04:05 crc kubenswrapper[4781]: I0227 01:04:05.387861 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535898-vpdkx"]
Feb 27 01:04:07 crc kubenswrapper[4781]: I0227 01:04:07.321482 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b518ad5e-0994-4767-9c6d-d2ca11998a43" path="/var/lib/kubelet/pods/b518ad5e-0994-4767-9c6d-d2ca11998a43/volumes"
Feb 27 01:04:32 crc kubenswrapper[4781]: I0227 01:04:32.567060 4781 scope.go:117] "RemoveContainer" containerID="5f9790a75567a30dcdf46b8e6f6e9baff3953d885f3c6f58834afe7ab39768fd"
Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.149262 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535906-d594j"]
Feb 27 01:06:00 crc kubenswrapper[4781]: E0227 01:06:00.150304 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8015524-a32f-427b-a5a9-08f1d2257259" containerName="oc"
Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.150320 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8015524-a32f-427b-a5a9-08f1d2257259" containerName="oc"
Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.150567 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8015524-a32f-427b-a5a9-08f1d2257259" containerName="oc"
Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.151355 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535906-d594j"
Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.156174 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr"
Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.156219 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.160912 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.164495 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535906-d594j"]
Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.283822 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl5jt\" (UniqueName: \"kubernetes.io/projected/dab64a02-9142-4b6f-95c2-1e3805ef62fc-kube-api-access-cl5jt\") pod \"auto-csr-approver-29535906-d594j\" (UID: \"dab64a02-9142-4b6f-95c2-1e3805ef62fc\") " pod="openshift-infra/auto-csr-approver-29535906-d594j"
Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.386301 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl5jt\" (UniqueName: \"kubernetes.io/projected/dab64a02-9142-4b6f-95c2-1e3805ef62fc-kube-api-access-cl5jt\") pod \"auto-csr-approver-29535906-d594j\" (UID: \"dab64a02-9142-4b6f-95c2-1e3805ef62fc\") " pod="openshift-infra/auto-csr-approver-29535906-d594j"
Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.405680 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl5jt\" (UniqueName: \"kubernetes.io/projected/dab64a02-9142-4b6f-95c2-1e3805ef62fc-kube-api-access-cl5jt\") pod \"auto-csr-approver-29535906-d594j\" (UID: \"dab64a02-9142-4b6f-95c2-1e3805ef62fc\") " pod="openshift-infra/auto-csr-approver-29535906-d594j"
Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.471083 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535906-d594j"
Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.943031 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535906-d594j"]
Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.947843 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 27 01:06:01 crc kubenswrapper[4781]: I0227 01:06:01.223739 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535906-d594j" event={"ID":"dab64a02-9142-4b6f-95c2-1e3805ef62fc","Type":"ContainerStarted","Data":"b674b21584f06bb732d8c99e018808b988bcda4f3eab8d21a9b2f326fbb0e016"}
Feb 27 01:06:02 crc kubenswrapper[4781]: I0227 01:06:02.236085 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535906-d594j" event={"ID":"dab64a02-9142-4b6f-95c2-1e3805ef62fc","Type":"ContainerStarted","Data":"b05e61a8466110a32ab8e96fdf9a1fec0c346bbcf2b136cb6d58c69fbbfe2a41"}
Feb 27 01:06:02 crc kubenswrapper[4781]: I0227 01:06:02.255821 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535906-d594j" podStartSLOduration=1.314092769 podStartE2EDuration="2.255802838s" podCreationTimestamp="2026-02-27 01:06:00 +0000 UTC" firstStartedPulling="2026-02-27 01:06:00.947540685 +0000 UTC m=+3630.205080239" lastFinishedPulling="2026-02-27 01:06:01.889250754 +0000 UTC m=+3631.146790308" observedRunningTime="2026-02-27 01:06:02.248088594 +0000 UTC m=+3631.505628148" watchObservedRunningTime="2026-02-27 01:06:02.255802838 +0000 UTC m=+3631.513342392"
Feb 27 01:06:03 crc kubenswrapper[4781]: I0227 01:06:03.249592 4781 generic.go:334] "Generic (PLEG): container finished" podID="dab64a02-9142-4b6f-95c2-1e3805ef62fc" containerID="b05e61a8466110a32ab8e96fdf9a1fec0c346bbcf2b136cb6d58c69fbbfe2a41" exitCode=0
Feb 27 01:06:03 crc kubenswrapper[4781]: I0227 01:06:03.249667 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535906-d594j" event={"ID":"dab64a02-9142-4b6f-95c2-1e3805ef62fc","Type":"ContainerDied","Data":"b05e61a8466110a32ab8e96fdf9a1fec0c346bbcf2b136cb6d58c69fbbfe2a41"}
Feb 27 01:06:04 crc kubenswrapper[4781]: I0227 01:06:04.682798 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535906-d594j"
Feb 27 01:06:04 crc kubenswrapper[4781]: I0227 01:06:04.773469 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl5jt\" (UniqueName: \"kubernetes.io/projected/dab64a02-9142-4b6f-95c2-1e3805ef62fc-kube-api-access-cl5jt\") pod \"dab64a02-9142-4b6f-95c2-1e3805ef62fc\" (UID: \"dab64a02-9142-4b6f-95c2-1e3805ef62fc\") "
Feb 27 01:06:04 crc kubenswrapper[4781]: I0227 01:06:04.780899 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab64a02-9142-4b6f-95c2-1e3805ef62fc-kube-api-access-cl5jt" (OuterVolumeSpecName: "kube-api-access-cl5jt") pod "dab64a02-9142-4b6f-95c2-1e3805ef62fc" (UID: "dab64a02-9142-4b6f-95c2-1e3805ef62fc"). InnerVolumeSpecName "kube-api-access-cl5jt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:06:04 crc kubenswrapper[4781]: I0227 01:06:04.876237 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl5jt\" (UniqueName: \"kubernetes.io/projected/dab64a02-9142-4b6f-95c2-1e3805ef62fc-kube-api-access-cl5jt\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:05 crc kubenswrapper[4781]: I0227 01:06:05.269394 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535906-d594j" event={"ID":"dab64a02-9142-4b6f-95c2-1e3805ef62fc","Type":"ContainerDied","Data":"b674b21584f06bb732d8c99e018808b988bcda4f3eab8d21a9b2f326fbb0e016"}
Feb 27 01:06:05 crc kubenswrapper[4781]: I0227 01:06:05.269680 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b674b21584f06bb732d8c99e018808b988bcda4f3eab8d21a9b2f326fbb0e016"
Feb 27 01:06:05 crc kubenswrapper[4781]: I0227 01:06:05.269688 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535906-d594j"
Feb 27 01:06:05 crc kubenswrapper[4781]: I0227 01:06:05.747363 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535900-6n6zr"]
Feb 27 01:06:05 crc kubenswrapper[4781]: I0227 01:06:05.756784 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535900-6n6zr"]
Feb 27 01:06:07 crc kubenswrapper[4781]: I0227 01:06:07.322106 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aac78d6-5f5c-4b48-95f2-554353abcdd3" path="/var/lib/kubelet/pods/0aac78d6-5f5c-4b48-95f2-554353abcdd3/volumes"
Feb 27 01:06:12 crc kubenswrapper[4781]: I0227 01:06:12.895793 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 01:06:12 crc kubenswrapper[4781]: I0227 01:06:12.896348 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 01:06:32 crc kubenswrapper[4781]: I0227 01:06:32.664931 4781 scope.go:117] "RemoveContainer" containerID="fb76bcf8730e0171831c959b0a00779c7b469f5264f4c1f6152625c6f8db5a04"
Feb 27 01:06:42 crc kubenswrapper[4781]: I0227 01:06:42.896001 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 01:06:42 crc kubenswrapper[4781]: I0227 01:06:42.896549 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 01:07:12 crc kubenswrapper[4781]: I0227 01:07:12.896057 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 01:07:12 crc kubenswrapper[4781]: I0227 01:07:12.896750 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 01:07:12 crc kubenswrapper[4781]: I0227 01:07:12.896810 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj"
Feb 27 01:07:12 crc kubenswrapper[4781]: I0227 01:07:12.897713 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"19e186a1d63090ffda2bb27999feb897d50891041c0f8dac4c6ddf6ef96ddf91"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 27 01:07:12 crc kubenswrapper[4781]: I0227 01:07:12.897775 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://19e186a1d63090ffda2bb27999feb897d50891041c0f8dac4c6ddf6ef96ddf91" gracePeriod=600
Feb 27 01:07:13 crc kubenswrapper[4781]: I0227 01:07:13.975278 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="19e186a1d63090ffda2bb27999feb897d50891041c0f8dac4c6ddf6ef96ddf91" exitCode=0
Feb 27 01:07:13 crc kubenswrapper[4781]: I0227 01:07:13.975368 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"19e186a1d63090ffda2bb27999feb897d50891041c0f8dac4c6ddf6ef96ddf91"}
Feb 27 01:07:13 crc kubenswrapper[4781]: I0227 01:07:13.975906 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5"}
Feb 27 01:07:13 crc kubenswrapper[4781]: I0227 01:07:13.975936 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.588682 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 27 01:07:46 crc kubenswrapper[4781]: E0227 01:07:46.590498 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab64a02-9142-4b6f-95c2-1e3805ef62fc" containerName="oc"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.590522 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab64a02-9142-4b6f-95c2-1e3805ef62fc" containerName="oc"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.590847 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab64a02-9142-4b6f-95c2-1e3805ef62fc" containerName="oc"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.592117 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.595282 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5s299"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.595444 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.595451 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.600839 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.605551 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.628958 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.629021 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.629050 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.629085 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-config-data\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.629110 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.629447 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.629587 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj679\" (UniqueName: \"kubernetes.io/projected/2cc23bf5-7773-4d33-b2be-2ee2a807f086-kube-api-access-dj679\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.629619 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.629890 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.731532 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj679\" (UniqueName: \"kubernetes.io/projected/2cc23bf5-7773-4d33-b2be-2ee2a807f086-kube-api-access-dj679\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.731930 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.732148 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.732337 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.732450 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.732552 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.732674 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-config-data\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.732782 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.732940 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.732492 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.733033 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.733615 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.733780 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.734178 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-config-data\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.739235 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.739734 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.741142 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.751979 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj679\" (UniqueName: \"kubernetes.io/projected/2cc23bf5-7773-4d33-b2be-2ee2a807f086-kube-api-access-dj679\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.762862 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.925350 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 27 01:07:47 crc kubenswrapper[4781]: I0227 01:07:47.390794 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 27 01:07:48 crc kubenswrapper[4781]: I0227 01:07:48.306816 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2cc23bf5-7773-4d33-b2be-2ee2a807f086","Type":"ContainerStarted","Data":"09459f242ec2925373f69aa651b16dcc96301f46d456e4eb0a8a401a4473bde9"}
Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.161741 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535908-lzshf"]
Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.164096 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535908-lzshf"
Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.167413 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.167740 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr"
Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.167877 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.177260 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535908-lzshf"]
Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.333505 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx5f5\" (UniqueName: \"kubernetes.io/projected/f6cd4500-04f9-471d-8c27-2ce1b03fa4f0-kube-api-access-gx5f5\") pod \"auto-csr-approver-29535908-lzshf\" (UID: \"f6cd4500-04f9-471d-8c27-2ce1b03fa4f0\") " pod="openshift-infra/auto-csr-approver-29535908-lzshf"
Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.436928 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx5f5\" (UniqueName: \"kubernetes.io/projected/f6cd4500-04f9-471d-8c27-2ce1b03fa4f0-kube-api-access-gx5f5\") pod \"auto-csr-approver-29535908-lzshf\" (UID: \"f6cd4500-04f9-471d-8c27-2ce1b03fa4f0\") " pod="openshift-infra/auto-csr-approver-29535908-lzshf"
Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.459257 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx5f5\" (UniqueName: \"kubernetes.io/projected/f6cd4500-04f9-471d-8c27-2ce1b03fa4f0-kube-api-access-gx5f5\") pod \"auto-csr-approver-29535908-lzshf\" (UID: \"f6cd4500-04f9-471d-8c27-2ce1b03fa4f0\") " pod="openshift-infra/auto-csr-approver-29535908-lzshf"
Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.493759 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535908-lzshf"
Feb 27 01:08:17 crc kubenswrapper[4781]: E0227 01:08:17.112000 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified"
Feb 27 01:08:17 crc kubenswrapper[4781]: E0227 01:08:17.113691 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dj679,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(2cc23bf5-7773-4d33-b2be-2ee2a807f086): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 27 01:08:17 crc kubenswrapper[4781]: E0227 01:08:17.115772 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="2cc23bf5-7773-4d33-b2be-2ee2a807f086"
Feb 27 01:08:17 crc kubenswrapper[4781]: E0227 01:08:17.647925 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="2cc23bf5-7773-4d33-b2be-2ee2a807f086"
Feb 27 01:08:17 crc kubenswrapper[4781]: I0227 01:08:17.655034 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535908-lzshf"]
Feb 27 01:08:18 crc kubenswrapper[4781]: I0227 01:08:18.656947 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535908-lzshf" event={"ID":"f6cd4500-04f9-471d-8c27-2ce1b03fa4f0","Type":"ContainerStarted","Data":"10a940d58a4c252f52d537dcb72a4d88359f6729136d8f994c4322e0b99dc05f"}
Feb 27 01:08:19 crc kubenswrapper[4781]: I0227 01:08:19.666810 4781 generic.go:334] "Generic (PLEG): container finished" podID="f6cd4500-04f9-471d-8c27-2ce1b03fa4f0" containerID="d86455502c8fe2209abff00ea2ac33cb262fd9455f065e8361be2d1baaf2ea79" exitCode=0
Feb 27 01:08:19 crc kubenswrapper[4781]: I0227 01:08:19.666918 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535908-lzshf" event={"ID":"f6cd4500-04f9-471d-8c27-2ce1b03fa4f0","Type":"ContainerDied","Data":"d86455502c8fe2209abff00ea2ac33cb262fd9455f065e8361be2d1baaf2ea79"}
Feb 27 01:08:21 crc kubenswrapper[4781]: I0227 01:08:21.083470 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535908-lzshf"
Feb 27 01:08:21 crc kubenswrapper[4781]: I0227 01:08:21.198549 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx5f5\" (UniqueName: \"kubernetes.io/projected/f6cd4500-04f9-471d-8c27-2ce1b03fa4f0-kube-api-access-gx5f5\") pod \"f6cd4500-04f9-471d-8c27-2ce1b03fa4f0\" (UID: \"f6cd4500-04f9-471d-8c27-2ce1b03fa4f0\") "
Feb 27 01:08:21 crc kubenswrapper[4781]: I0227 01:08:21.205600 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6cd4500-04f9-471d-8c27-2ce1b03fa4f0-kube-api-access-gx5f5" (OuterVolumeSpecName: "kube-api-access-gx5f5") pod "f6cd4500-04f9-471d-8c27-2ce1b03fa4f0" (UID: "f6cd4500-04f9-471d-8c27-2ce1b03fa4f0"). InnerVolumeSpecName "kube-api-access-gx5f5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:08:21 crc kubenswrapper[4781]: I0227 01:08:21.302293 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx5f5\" (UniqueName: \"kubernetes.io/projected/f6cd4500-04f9-471d-8c27-2ce1b03fa4f0-kube-api-access-gx5f5\") on node \"crc\" DevicePath \"\""
Feb 27 01:08:21 crc kubenswrapper[4781]: I0227 01:08:21.688478 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535908-lzshf" event={"ID":"f6cd4500-04f9-471d-8c27-2ce1b03fa4f0","Type":"ContainerDied","Data":"10a940d58a4c252f52d537dcb72a4d88359f6729136d8f994c4322e0b99dc05f"}
Feb 27 01:08:21 crc kubenswrapper[4781]: I0227 01:08:21.688518 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10a940d58a4c252f52d537dcb72a4d88359f6729136d8f994c4322e0b99dc05f"
Feb 27 01:08:21 crc kubenswrapper[4781]: I0227 01:08:21.688565 4781 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535908-lzshf" Feb 27 01:08:22 crc kubenswrapper[4781]: I0227 01:08:22.156995 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535902-bkgwj"] Feb 27 01:08:22 crc kubenswrapper[4781]: I0227 01:08:22.166580 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535902-bkgwj"] Feb 27 01:08:23 crc kubenswrapper[4781]: I0227 01:08:23.323717 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da201e7f-72da-4998-8ecb-98a8814f423d" path="/var/lib/kubelet/pods/da201e7f-72da-4998-8ecb-98a8814f423d/volumes" Feb 27 01:08:31 crc kubenswrapper[4781]: I0227 01:08:31.745381 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 27 01:08:32 crc kubenswrapper[4781]: I0227 01:08:32.770588 4781 scope.go:117] "RemoveContainer" containerID="cba817e11e179b47fa5e55d89f7bb6242121790488edf6a29e663a57c82230bd" Feb 27 01:08:32 crc kubenswrapper[4781]: I0227 01:08:32.813587 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2cc23bf5-7773-4d33-b2be-2ee2a807f086","Type":"ContainerStarted","Data":"4ec0cfbe0f662afc3fb53d5da9b369a462851688dbd0c754fa273d9d0f52d0e5"} Feb 27 01:08:32 crc kubenswrapper[4781]: I0227 01:08:32.841126 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.495126001 podStartE2EDuration="47.841101987s" podCreationTimestamp="2026-02-27 01:07:45 +0000 UTC" firstStartedPulling="2026-02-27 01:07:47.395739584 +0000 UTC m=+3736.653279138" lastFinishedPulling="2026-02-27 01:08:31.74171557 +0000 UTC m=+3780.999255124" observedRunningTime="2026-02-27 01:08:32.839206277 +0000 UTC m=+3782.096745841" watchObservedRunningTime="2026-02-27 01:08:32.841101987 +0000 UTC m=+3782.098641561" Feb 27 01:09:42 crc 
kubenswrapper[4781]: I0227 01:09:42.895055 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:09:42 crc kubenswrapper[4781]: I0227 01:09:42.895799 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.161968 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535910-zxgrs"] Feb 27 01:10:00 crc kubenswrapper[4781]: E0227 01:10:00.163133 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6cd4500-04f9-471d-8c27-2ce1b03fa4f0" containerName="oc" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.163149 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cd4500-04f9-471d-8c27-2ce1b03fa4f0" containerName="oc" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.163415 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6cd4500-04f9-471d-8c27-2ce1b03fa4f0" containerName="oc" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.164516 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.167387 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.167724 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.168144 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.175572 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535910-zxgrs"] Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.265885 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsqk8\" (UniqueName: \"kubernetes.io/projected/b5688def-e560-413e-8be5-1d2cfd7e7b4b-kube-api-access-rsqk8\") pod \"auto-csr-approver-29535910-zxgrs\" (UID: \"b5688def-e560-413e-8be5-1d2cfd7e7b4b\") " pod="openshift-infra/auto-csr-approver-29535910-zxgrs" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.368687 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsqk8\" (UniqueName: \"kubernetes.io/projected/b5688def-e560-413e-8be5-1d2cfd7e7b4b-kube-api-access-rsqk8\") pod \"auto-csr-approver-29535910-zxgrs\" (UID: \"b5688def-e560-413e-8be5-1d2cfd7e7b4b\") " pod="openshift-infra/auto-csr-approver-29535910-zxgrs" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.398025 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsqk8\" (UniqueName: \"kubernetes.io/projected/b5688def-e560-413e-8be5-1d2cfd7e7b4b-kube-api-access-rsqk8\") pod \"auto-csr-approver-29535910-zxgrs\" (UID: \"b5688def-e560-413e-8be5-1d2cfd7e7b4b\") " 
pod="openshift-infra/auto-csr-approver-29535910-zxgrs" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.489032 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.992755 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535910-zxgrs"] Feb 27 01:10:01 crc kubenswrapper[4781]: I0227 01:10:01.646773 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" event={"ID":"b5688def-e560-413e-8be5-1d2cfd7e7b4b","Type":"ContainerStarted","Data":"6b0241584b0c5ee639c2e96362f32b966983ca1ca8d046d5c36ce5fbdc167f06"} Feb 27 01:10:03 crc kubenswrapper[4781]: I0227 01:10:03.670578 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" event={"ID":"b5688def-e560-413e-8be5-1d2cfd7e7b4b","Type":"ContainerStarted","Data":"2f2421387f96858e89c61569a502259afb51c7ee81cb327e3f4310b20461360e"} Feb 27 01:10:03 crc kubenswrapper[4781]: I0227 01:10:03.697442 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" podStartSLOduration=2.101965146 podStartE2EDuration="3.697423411s" podCreationTimestamp="2026-02-27 01:10:00 +0000 UTC" firstStartedPulling="2026-02-27 01:10:00.994507304 +0000 UTC m=+3870.252046858" lastFinishedPulling="2026-02-27 01:10:02.589965569 +0000 UTC m=+3871.847505123" observedRunningTime="2026-02-27 01:10:03.686985704 +0000 UTC m=+3872.944525248" watchObservedRunningTime="2026-02-27 01:10:03.697423411 +0000 UTC m=+3872.954962965" Feb 27 01:10:04 crc kubenswrapper[4781]: I0227 01:10:04.682659 4781 generic.go:334] "Generic (PLEG): container finished" podID="b5688def-e560-413e-8be5-1d2cfd7e7b4b" containerID="2f2421387f96858e89c61569a502259afb51c7ee81cb327e3f4310b20461360e" exitCode=0 Feb 27 01:10:04 crc 
kubenswrapper[4781]: I0227 01:10:04.682858 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" event={"ID":"b5688def-e560-413e-8be5-1d2cfd7e7b4b","Type":"ContainerDied","Data":"2f2421387f96858e89c61569a502259afb51c7ee81cb327e3f4310b20461360e"} Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.352263 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.502363 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsqk8\" (UniqueName: \"kubernetes.io/projected/b5688def-e560-413e-8be5-1d2cfd7e7b4b-kube-api-access-rsqk8\") pod \"b5688def-e560-413e-8be5-1d2cfd7e7b4b\" (UID: \"b5688def-e560-413e-8be5-1d2cfd7e7b4b\") " Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.508725 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5688def-e560-413e-8be5-1d2cfd7e7b4b-kube-api-access-rsqk8" (OuterVolumeSpecName: "kube-api-access-rsqk8") pod "b5688def-e560-413e-8be5-1d2cfd7e7b4b" (UID: "b5688def-e560-413e-8be5-1d2cfd7e7b4b"). InnerVolumeSpecName "kube-api-access-rsqk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.604840 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsqk8\" (UniqueName: \"kubernetes.io/projected/b5688def-e560-413e-8be5-1d2cfd7e7b4b-kube-api-access-rsqk8\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.705307 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" event={"ID":"b5688def-e560-413e-8be5-1d2cfd7e7b4b","Type":"ContainerDied","Data":"6b0241584b0c5ee639c2e96362f32b966983ca1ca8d046d5c36ce5fbdc167f06"} Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.705546 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b0241584b0c5ee639c2e96362f32b966983ca1ca8d046d5c36ce5fbdc167f06" Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.705543 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.775192 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535904-lpvnq"] Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.784348 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535904-lpvnq"] Feb 27 01:10:07 crc kubenswrapper[4781]: I0227 01:10:07.325529 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8015524-a32f-427b-a5a9-08f1d2257259" path="/var/lib/kubelet/pods/f8015524-a32f-427b-a5a9-08f1d2257259/volumes" Feb 27 01:10:12 crc kubenswrapper[4781]: I0227 01:10:12.895902 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 27 01:10:12 crc kubenswrapper[4781]: I0227 01:10:12.896404 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:10:32 crc kubenswrapper[4781]: I0227 01:10:32.897119 4781 scope.go:117] "RemoveContainer" containerID="7727ecd4b6ab2c57f71f74adfa530ee79124f2b2f80dab2ef9d287684b1949a8" Feb 27 01:10:42 crc kubenswrapper[4781]: I0227 01:10:42.896237 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:10:42 crc kubenswrapper[4781]: I0227 01:10:42.896898 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:10:42 crc kubenswrapper[4781]: I0227 01:10:42.896952 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 01:10:42 crc kubenswrapper[4781]: I0227 01:10:42.897908 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 
27 01:10:42 crc kubenswrapper[4781]: I0227 01:10:42.897971 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" gracePeriod=600 Feb 27 01:10:43 crc kubenswrapper[4781]: E0227 01:10:43.022675 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:10:43 crc kubenswrapper[4781]: I0227 01:10:43.054293 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" exitCode=0 Feb 27 01:10:43 crc kubenswrapper[4781]: I0227 01:10:43.054376 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5"} Feb 27 01:10:43 crc kubenswrapper[4781]: I0227 01:10:43.054445 4781 scope.go:117] "RemoveContainer" containerID="19e186a1d63090ffda2bb27999feb897d50891041c0f8dac4c6ddf6ef96ddf91" Feb 27 01:10:43 crc kubenswrapper[4781]: I0227 01:10:43.055238 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:10:43 crc kubenswrapper[4781]: E0227 01:10:43.055542 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:10:55 crc kubenswrapper[4781]: I0227 01:10:55.315852 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:10:55 crc kubenswrapper[4781]: E0227 01:10:55.316613 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:11:07 crc kubenswrapper[4781]: I0227 01:11:07.310142 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:11:07 crc kubenswrapper[4781]: E0227 01:11:07.311077 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.195656 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ggngc"] Feb 27 01:11:18 crc kubenswrapper[4781]: E0227 01:11:18.196937 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b5688def-e560-413e-8be5-1d2cfd7e7b4b" containerName="oc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.196954 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5688def-e560-413e-8be5-1d2cfd7e7b4b" containerName="oc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.197197 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5688def-e560-413e-8be5-1d2cfd7e7b4b" containerName="oc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.199030 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.206453 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ggngc"] Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.254049 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-catalog-content\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.254509 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-utilities\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.254616 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmjx4\" (UniqueName: \"kubernetes.io/projected/9823bd09-bd8f-4565-8437-90af124c41f3-kube-api-access-zmjx4\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " 
pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.310950 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:11:18 crc kubenswrapper[4781]: E0227 01:11:18.311510 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.356258 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-utilities\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.356576 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmjx4\" (UniqueName: \"kubernetes.io/projected/9823bd09-bd8f-4565-8437-90af124c41f3-kube-api-access-zmjx4\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.356759 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-catalog-content\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.356856 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-utilities\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.357140 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-catalog-content\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.376689 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmjx4\" (UniqueName: \"kubernetes.io/projected/9823bd09-bd8f-4565-8437-90af124c41f3-kube-api-access-zmjx4\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.534266 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:19 crc kubenswrapper[4781]: I0227 01:11:19.044303 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ggngc"] Feb 27 01:11:19 crc kubenswrapper[4781]: I0227 01:11:19.420131 4781 generic.go:334] "Generic (PLEG): container finished" podID="9823bd09-bd8f-4565-8437-90af124c41f3" containerID="cd87c8220ae390b7d57fc9d6d38a9e53b68e245d14b3b9b15d787a819cfb9cd2" exitCode=0 Feb 27 01:11:19 crc kubenswrapper[4781]: I0227 01:11:19.420229 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggngc" event={"ID":"9823bd09-bd8f-4565-8437-90af124c41f3","Type":"ContainerDied","Data":"cd87c8220ae390b7d57fc9d6d38a9e53b68e245d14b3b9b15d787a819cfb9cd2"} Feb 27 01:11:19 crc kubenswrapper[4781]: I0227 01:11:19.420463 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggngc" event={"ID":"9823bd09-bd8f-4565-8437-90af124c41f3","Type":"ContainerStarted","Data":"5d9f1a5f9bea6c2522bc7d5eae4266a7c1dd56573912e3c0e29be855c9bd30fa"} Feb 27 01:11:19 crc kubenswrapper[4781]: I0227 01:11:19.421881 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:11:20 crc kubenswrapper[4781]: I0227 01:11:20.432103 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggngc" event={"ID":"9823bd09-bd8f-4565-8437-90af124c41f3","Type":"ContainerStarted","Data":"5953fda994cd7ac67e94d751a7a40e33b594ebe74d39bf82a391b26ac6914867"} Feb 27 01:11:26 crc kubenswrapper[4781]: I0227 01:11:26.621354 4781 generic.go:334] "Generic (PLEG): container finished" podID="9823bd09-bd8f-4565-8437-90af124c41f3" containerID="5953fda994cd7ac67e94d751a7a40e33b594ebe74d39bf82a391b26ac6914867" exitCode=0 Feb 27 01:11:26 crc kubenswrapper[4781]: I0227 01:11:26.621463 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-ggngc" event={"ID":"9823bd09-bd8f-4565-8437-90af124c41f3","Type":"ContainerDied","Data":"5953fda994cd7ac67e94d751a7a40e33b594ebe74d39bf82a391b26ac6914867"} Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.170818 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmpc"] Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.173602 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.202678 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmpc"] Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.234781 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-utilities\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.234859 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-catalog-content\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.234894 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgxc7\" (UniqueName: \"kubernetes.io/projected/858cd97e-43e3-45ce-be89-d6da5a51aac7-kube-api-access-vgxc7\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc 
kubenswrapper[4781]: I0227 01:11:27.337244 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-utilities\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.337337 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-catalog-content\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.337380 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgxc7\" (UniqueName: \"kubernetes.io/projected/858cd97e-43e3-45ce-be89-d6da5a51aac7-kube-api-access-vgxc7\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.339381 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-catalog-content\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.339471 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-utilities\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.379312 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgxc7\" (UniqueName: \"kubernetes.io/projected/858cd97e-43e3-45ce-be89-d6da5a51aac7-kube-api-access-vgxc7\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.498887 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.642989 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggngc" event={"ID":"9823bd09-bd8f-4565-8437-90af124c41f3","Type":"ContainerStarted","Data":"5401633aabeaa5cbe1470b20fd3bcb61c3185700eaa796901f5a2bf04d10b7fd"} Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.672053 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ggngc" podStartSLOduration=2.095321715 podStartE2EDuration="9.671617633s" podCreationTimestamp="2026-02-27 01:11:18 +0000 UTC" firstStartedPulling="2026-02-27 01:11:19.421687611 +0000 UTC m=+3948.679227165" lastFinishedPulling="2026-02-27 01:11:26.997983529 +0000 UTC m=+3956.255523083" observedRunningTime="2026-02-27 01:11:27.663178978 +0000 UTC m=+3956.920718532" watchObservedRunningTime="2026-02-27 01:11:27.671617633 +0000 UTC m=+3956.929157197" Feb 27 01:11:28 crc kubenswrapper[4781]: I0227 01:11:28.006269 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmpc"] Feb 27 01:11:28 crc kubenswrapper[4781]: W0227 01:11:28.008951 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod858cd97e_43e3_45ce_be89_d6da5a51aac7.slice/crio-40372cb860e2866b26af4ce398fbba0bee8e1e438dae95e85433fd83b1b549fe WatchSource:0}: Error finding container 
40372cb860e2866b26af4ce398fbba0bee8e1e438dae95e85433fd83b1b549fe: Status 404 returned error can't find the container with id 40372cb860e2866b26af4ce398fbba0bee8e1e438dae95e85433fd83b1b549fe Feb 27 01:11:28 crc kubenswrapper[4781]: I0227 01:11:28.535107 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:28 crc kubenswrapper[4781]: I0227 01:11:28.535472 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:28 crc kubenswrapper[4781]: I0227 01:11:28.653247 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmpc" event={"ID":"858cd97e-43e3-45ce-be89-d6da5a51aac7","Type":"ContainerStarted","Data":"2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb"} Feb 27 01:11:28 crc kubenswrapper[4781]: I0227 01:11:28.653304 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmpc" event={"ID":"858cd97e-43e3-45ce-be89-d6da5a51aac7","Type":"ContainerStarted","Data":"40372cb860e2866b26af4ce398fbba0bee8e1e438dae95e85433fd83b1b549fe"} Feb 27 01:11:29 crc kubenswrapper[4781]: I0227 01:11:29.309663 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:11:29 crc kubenswrapper[4781]: E0227 01:11:29.310231 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:11:29 crc kubenswrapper[4781]: I0227 01:11:29.582874 4781 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-ggngc" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="registry-server" probeResult="failure" output=< Feb 27 01:11:29 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 01:11:29 crc kubenswrapper[4781]: > Feb 27 01:11:29 crc kubenswrapper[4781]: I0227 01:11:29.669400 4781 generic.go:334] "Generic (PLEG): container finished" podID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerID="2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb" exitCode=0 Feb 27 01:11:29 crc kubenswrapper[4781]: I0227 01:11:29.669445 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmpc" event={"ID":"858cd97e-43e3-45ce-be89-d6da5a51aac7","Type":"ContainerDied","Data":"2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb"} Feb 27 01:11:30 crc kubenswrapper[4781]: I0227 01:11:30.680774 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmpc" event={"ID":"858cd97e-43e3-45ce-be89-d6da5a51aac7","Type":"ContainerStarted","Data":"f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69"} Feb 27 01:11:32 crc kubenswrapper[4781]: I0227 01:11:32.704798 4781 generic.go:334] "Generic (PLEG): container finished" podID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerID="f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69" exitCode=0 Feb 27 01:11:32 crc kubenswrapper[4781]: I0227 01:11:32.704903 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmpc" event={"ID":"858cd97e-43e3-45ce-be89-d6da5a51aac7","Type":"ContainerDied","Data":"f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69"} Feb 27 01:11:33 crc kubenswrapper[4781]: I0227 01:11:33.726775 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmpc" 
event={"ID":"858cd97e-43e3-45ce-be89-d6da5a51aac7","Type":"ContainerStarted","Data":"b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8"} Feb 27 01:11:33 crc kubenswrapper[4781]: I0227 01:11:33.786723 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ncmpc" podStartSLOduration=3.347529622 podStartE2EDuration="6.786702413s" podCreationTimestamp="2026-02-27 01:11:27 +0000 UTC" firstStartedPulling="2026-02-27 01:11:29.672002658 +0000 UTC m=+3958.929542212" lastFinishedPulling="2026-02-27 01:11:33.111175449 +0000 UTC m=+3962.368715003" observedRunningTime="2026-02-27 01:11:33.779385428 +0000 UTC m=+3963.036924972" watchObservedRunningTime="2026-02-27 01:11:33.786702413 +0000 UTC m=+3963.044241967" Feb 27 01:11:37 crc kubenswrapper[4781]: I0227 01:11:37.499219 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:37 crc kubenswrapper[4781]: I0227 01:11:37.499868 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:38 crc kubenswrapper[4781]: I0227 01:11:38.562862 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-ncmpc" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="registry-server" probeResult="failure" output=< Feb 27 01:11:38 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 01:11:38 crc kubenswrapper[4781]: > Feb 27 01:11:39 crc kubenswrapper[4781]: I0227 01:11:39.588921 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ggngc" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="registry-server" probeResult="failure" output=< Feb 27 01:11:39 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 01:11:39 crc kubenswrapper[4781]: > 
Feb 27 01:11:43 crc kubenswrapper[4781]: I0227 01:11:43.312456 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:11:43 crc kubenswrapper[4781]: E0227 01:11:43.314297 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:11:47 crc kubenswrapper[4781]: I0227 01:11:47.549554 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:47 crc kubenswrapper[4781]: I0227 01:11:47.604342 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:47 crc kubenswrapper[4781]: I0227 01:11:47.812018 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmpc"] Feb 27 01:11:48 crc kubenswrapper[4781]: I0227 01:11:48.588724 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:48 crc kubenswrapper[4781]: I0227 01:11:48.640391 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:48 crc kubenswrapper[4781]: I0227 01:11:48.929325 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ncmpc" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="registry-server" containerID="cri-o://b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8" gracePeriod=2 Feb 27 01:11:49 crc 
kubenswrapper[4781]: I0227 01:11:49.725121 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.885956 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-utilities\") pod \"858cd97e-43e3-45ce-be89-d6da5a51aac7\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.886471 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgxc7\" (UniqueName: \"kubernetes.io/projected/858cd97e-43e3-45ce-be89-d6da5a51aac7-kube-api-access-vgxc7\") pod \"858cd97e-43e3-45ce-be89-d6da5a51aac7\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.886508 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-catalog-content\") pod \"858cd97e-43e3-45ce-be89-d6da5a51aac7\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.886554 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-utilities" (OuterVolumeSpecName: "utilities") pod "858cd97e-43e3-45ce-be89-d6da5a51aac7" (UID: "858cd97e-43e3-45ce-be89-d6da5a51aac7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.887216 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.892905 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/858cd97e-43e3-45ce-be89-d6da5a51aac7-kube-api-access-vgxc7" (OuterVolumeSpecName: "kube-api-access-vgxc7") pod "858cd97e-43e3-45ce-be89-d6da5a51aac7" (UID: "858cd97e-43e3-45ce-be89-d6da5a51aac7"). InnerVolumeSpecName "kube-api-access-vgxc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.912215 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "858cd97e-43e3-45ce-be89-d6da5a51aac7" (UID: "858cd97e-43e3-45ce-be89-d6da5a51aac7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.944214 4781 generic.go:334] "Generic (PLEG): container finished" podID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerID="b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8" exitCode=0 Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.944291 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmpc" event={"ID":"858cd97e-43e3-45ce-be89-d6da5a51aac7","Type":"ContainerDied","Data":"b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8"} Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.944333 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmpc" event={"ID":"858cd97e-43e3-45ce-be89-d6da5a51aac7","Type":"ContainerDied","Data":"40372cb860e2866b26af4ce398fbba0bee8e1e438dae95e85433fd83b1b549fe"} Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.944356 4781 scope.go:117] "RemoveContainer" containerID="b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.944597 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.980044 4781 scope.go:117] "RemoveContainer" containerID="f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.989894 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgxc7\" (UniqueName: \"kubernetes.io/projected/858cd97e-43e3-45ce-be89-d6da5a51aac7-kube-api-access-vgxc7\") on node \"crc\" DevicePath \"\"" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.989923 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.991988 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmpc"] Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.005542 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmpc"] Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.021580 4781 scope.go:117] "RemoveContainer" containerID="2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb" Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.057967 4781 scope.go:117] "RemoveContainer" containerID="b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8" Feb 27 01:11:50 crc kubenswrapper[4781]: E0227 01:11:50.058521 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8\": container with ID starting with b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8 not found: ID does not exist" containerID="b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8" Feb 27 01:11:50 crc 
kubenswrapper[4781]: I0227 01:11:50.058552 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8"} err="failed to get container status \"b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8\": rpc error: code = NotFound desc = could not find container \"b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8\": container with ID starting with b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8 not found: ID does not exist" Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.058591 4781 scope.go:117] "RemoveContainer" containerID="f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69" Feb 27 01:11:50 crc kubenswrapper[4781]: E0227 01:11:50.059020 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69\": container with ID starting with f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69 not found: ID does not exist" containerID="f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69" Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.059047 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69"} err="failed to get container status \"f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69\": rpc error: code = NotFound desc = could not find container \"f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69\": container with ID starting with f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69 not found: ID does not exist" Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.059060 4781 scope.go:117] "RemoveContainer" containerID="2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb" Feb 27 
01:11:50 crc kubenswrapper[4781]: E0227 01:11:50.059385 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb\": container with ID starting with 2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb not found: ID does not exist" containerID="2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb" Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.059425 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb"} err="failed to get container status \"2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb\": rpc error: code = NotFound desc = could not find container \"2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb\": container with ID starting with 2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb not found: ID does not exist" Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.398071 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ggngc"] Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.398710 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ggngc" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="registry-server" containerID="cri-o://5401633aabeaa5cbe1470b20fd3bcb61c3185700eaa796901f5a2bf04d10b7fd" gracePeriod=2 Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.961153 4781 generic.go:334] "Generic (PLEG): container finished" podID="9823bd09-bd8f-4565-8437-90af124c41f3" containerID="5401633aabeaa5cbe1470b20fd3bcb61c3185700eaa796901f5a2bf04d10b7fd" exitCode=0 Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.961208 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-ggngc" event={"ID":"9823bd09-bd8f-4565-8437-90af124c41f3","Type":"ContainerDied","Data":"5401633aabeaa5cbe1470b20fd3bcb61c3185700eaa796901f5a2bf04d10b7fd"} Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.151112 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.219006 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-catalog-content\") pod \"9823bd09-bd8f-4565-8437-90af124c41f3\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.219087 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-utilities\") pod \"9823bd09-bd8f-4565-8437-90af124c41f3\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.219328 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmjx4\" (UniqueName: \"kubernetes.io/projected/9823bd09-bd8f-4565-8437-90af124c41f3-kube-api-access-zmjx4\") pod \"9823bd09-bd8f-4565-8437-90af124c41f3\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.219805 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-utilities" (OuterVolumeSpecName: "utilities") pod "9823bd09-bd8f-4565-8437-90af124c41f3" (UID: "9823bd09-bd8f-4565-8437-90af124c41f3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.220357 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.224896 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9823bd09-bd8f-4565-8437-90af124c41f3-kube-api-access-zmjx4" (OuterVolumeSpecName: "kube-api-access-zmjx4") pod "9823bd09-bd8f-4565-8437-90af124c41f3" (UID: "9823bd09-bd8f-4565-8437-90af124c41f3"). InnerVolumeSpecName "kube-api-access-zmjx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.322535 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmjx4\" (UniqueName: \"kubernetes.io/projected/9823bd09-bd8f-4565-8437-90af124c41f3-kube-api-access-zmjx4\") on node \"crc\" DevicePath \"\"" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.325893 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" path="/var/lib/kubelet/pods/858cd97e-43e3-45ce-be89-d6da5a51aac7/volumes" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.383126 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9823bd09-bd8f-4565-8437-90af124c41f3" (UID: "9823bd09-bd8f-4565-8437-90af124c41f3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.424987 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.977748 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggngc" event={"ID":"9823bd09-bd8f-4565-8437-90af124c41f3","Type":"ContainerDied","Data":"5d9f1a5f9bea6c2522bc7d5eae4266a7c1dd56573912e3c0e29be855c9bd30fa"} Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.977804 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.977828 4781 scope.go:117] "RemoveContainer" containerID="5401633aabeaa5cbe1470b20fd3bcb61c3185700eaa796901f5a2bf04d10b7fd" Feb 27 01:11:52 crc kubenswrapper[4781]: I0227 01:11:52.015092 4781 scope.go:117] "RemoveContainer" containerID="5953fda994cd7ac67e94d751a7a40e33b594ebe74d39bf82a391b26ac6914867" Feb 27 01:11:52 crc kubenswrapper[4781]: I0227 01:11:52.017527 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ggngc"] Feb 27 01:11:52 crc kubenswrapper[4781]: I0227 01:11:52.035431 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ggngc"] Feb 27 01:11:52 crc kubenswrapper[4781]: I0227 01:11:52.054860 4781 scope.go:117] "RemoveContainer" containerID="cd87c8220ae390b7d57fc9d6d38a9e53b68e245d14b3b9b15d787a819cfb9cd2" Feb 27 01:11:53 crc kubenswrapper[4781]: I0227 01:11:53.322668 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" path="/var/lib/kubelet/pods/9823bd09-bd8f-4565-8437-90af124c41f3/volumes" Feb 27 01:11:55 crc 
kubenswrapper[4781]: I0227 01:11:55.309612 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:11:55 crc kubenswrapper[4781]: E0227 01:11:55.310586 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.150531 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535912-gtvv9"] Feb 27 01:12:00 crc kubenswrapper[4781]: E0227 01:12:00.151513 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="extract-utilities" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.151526 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="extract-utilities" Feb 27 01:12:00 crc kubenswrapper[4781]: E0227 01:12:00.151548 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="registry-server" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.151556 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="registry-server" Feb 27 01:12:00 crc kubenswrapper[4781]: E0227 01:12:00.151569 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="extract-utilities" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.151575 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" 
containerName="extract-utilities" Feb 27 01:12:00 crc kubenswrapper[4781]: E0227 01:12:00.151585 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="extract-content" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.151591 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="extract-content" Feb 27 01:12:00 crc kubenswrapper[4781]: E0227 01:12:00.151601 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="registry-server" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.151607 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="registry-server" Feb 27 01:12:00 crc kubenswrapper[4781]: E0227 01:12:00.151651 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="extract-content" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.151657 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="extract-content" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.151858 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="registry-server" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.151880 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="registry-server" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.152884 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535912-gtvv9" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.156290 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.156562 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.156752 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.165205 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535912-gtvv9"] Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.207802 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4dd8\" (UniqueName: \"kubernetes.io/projected/95d6c94d-6b4e-4d64-8c67-eb43c03187c2-kube-api-access-n4dd8\") pod \"auto-csr-approver-29535912-gtvv9\" (UID: \"95d6c94d-6b4e-4d64-8c67-eb43c03187c2\") " pod="openshift-infra/auto-csr-approver-29535912-gtvv9" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.310418 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4dd8\" (UniqueName: \"kubernetes.io/projected/95d6c94d-6b4e-4d64-8c67-eb43c03187c2-kube-api-access-n4dd8\") pod \"auto-csr-approver-29535912-gtvv9\" (UID: \"95d6c94d-6b4e-4d64-8c67-eb43c03187c2\") " pod="openshift-infra/auto-csr-approver-29535912-gtvv9" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.336372 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4dd8\" (UniqueName: \"kubernetes.io/projected/95d6c94d-6b4e-4d64-8c67-eb43c03187c2-kube-api-access-n4dd8\") pod \"auto-csr-approver-29535912-gtvv9\" (UID: \"95d6c94d-6b4e-4d64-8c67-eb43c03187c2\") " 
pod="openshift-infra/auto-csr-approver-29535912-gtvv9" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.475020 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535912-gtvv9" Feb 27 01:12:01 crc kubenswrapper[4781]: I0227 01:12:01.028781 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535912-gtvv9"] Feb 27 01:12:01 crc kubenswrapper[4781]: I0227 01:12:01.063484 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535912-gtvv9" event={"ID":"95d6c94d-6b4e-4d64-8c67-eb43c03187c2","Type":"ContainerStarted","Data":"ea75dc0bb1b7c654c05cb33d20795b28e260254c83b72d580f113ad5a3a0caaa"} Feb 27 01:12:03 crc kubenswrapper[4781]: I0227 01:12:03.083257 4781 generic.go:334] "Generic (PLEG): container finished" podID="95d6c94d-6b4e-4d64-8c67-eb43c03187c2" containerID="8f787ca4f347bb157c6f5d9ee468bbb739868634c8f4daa10b685f41a5344282" exitCode=0 Feb 27 01:12:03 crc kubenswrapper[4781]: I0227 01:12:03.083331 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535912-gtvv9" event={"ID":"95d6c94d-6b4e-4d64-8c67-eb43c03187c2","Type":"ContainerDied","Data":"8f787ca4f347bb157c6f5d9ee468bbb739868634c8f4daa10b685f41a5344282"} Feb 27 01:12:04 crc kubenswrapper[4781]: I0227 01:12:04.809880 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535912-gtvv9" Feb 27 01:12:04 crc kubenswrapper[4781]: I0227 01:12:04.883336 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4dd8\" (UniqueName: \"kubernetes.io/projected/95d6c94d-6b4e-4d64-8c67-eb43c03187c2-kube-api-access-n4dd8\") pod \"95d6c94d-6b4e-4d64-8c67-eb43c03187c2\" (UID: \"95d6c94d-6b4e-4d64-8c67-eb43c03187c2\") " Feb 27 01:12:04 crc kubenswrapper[4781]: I0227 01:12:04.893162 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d6c94d-6b4e-4d64-8c67-eb43c03187c2-kube-api-access-n4dd8" (OuterVolumeSpecName: "kube-api-access-n4dd8") pod "95d6c94d-6b4e-4d64-8c67-eb43c03187c2" (UID: "95d6c94d-6b4e-4d64-8c67-eb43c03187c2"). InnerVolumeSpecName "kube-api-access-n4dd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:12:04 crc kubenswrapper[4781]: I0227 01:12:04.987975 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4dd8\" (UniqueName: \"kubernetes.io/projected/95d6c94d-6b4e-4d64-8c67-eb43c03187c2-kube-api-access-n4dd8\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:05 crc kubenswrapper[4781]: I0227 01:12:05.133918 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535912-gtvv9" event={"ID":"95d6c94d-6b4e-4d64-8c67-eb43c03187c2","Type":"ContainerDied","Data":"ea75dc0bb1b7c654c05cb33d20795b28e260254c83b72d580f113ad5a3a0caaa"} Feb 27 01:12:05 crc kubenswrapper[4781]: I0227 01:12:05.133958 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea75dc0bb1b7c654c05cb33d20795b28e260254c83b72d580f113ad5a3a0caaa" Feb 27 01:12:05 crc kubenswrapper[4781]: I0227 01:12:05.134010 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535912-gtvv9" Feb 27 01:12:05 crc kubenswrapper[4781]: I0227 01:12:05.917994 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535906-d594j"] Feb 27 01:12:05 crc kubenswrapper[4781]: I0227 01:12:05.926828 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535906-d594j"] Feb 27 01:12:07 crc kubenswrapper[4781]: I0227 01:12:07.323271 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab64a02-9142-4b6f-95c2-1e3805ef62fc" path="/var/lib/kubelet/pods/dab64a02-9142-4b6f-95c2-1e3805ef62fc/volumes" Feb 27 01:12:08 crc kubenswrapper[4781]: I0227 01:12:08.310165 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:12:08 crc kubenswrapper[4781]: E0227 01:12:08.310533 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:12:22 crc kubenswrapper[4781]: I0227 01:12:22.309556 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:12:22 crc kubenswrapper[4781]: E0227 01:12:22.311480 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" 
podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:12:32 crc kubenswrapper[4781]: I0227 01:12:32.998012 4781 scope.go:117] "RemoveContainer" containerID="b05e61a8466110a32ab8e96fdf9a1fec0c346bbcf2b136cb6d58c69fbbfe2a41" Feb 27 01:12:37 crc kubenswrapper[4781]: I0227 01:12:37.309978 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:12:37 crc kubenswrapper[4781]: E0227 01:12:37.311334 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:12:50 crc kubenswrapper[4781]: I0227 01:12:50.309702 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:12:50 crc kubenswrapper[4781]: E0227 01:12:50.310571 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:13:01 crc kubenswrapper[4781]: I0227 01:13:01.319433 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:13:01 crc kubenswrapper[4781]: E0227 01:13:01.320286 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.382533 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sjrw7"] Feb 27 01:13:02 crc kubenswrapper[4781]: E0227 01:13:02.383100 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d6c94d-6b4e-4d64-8c67-eb43c03187c2" containerName="oc" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.383115 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d6c94d-6b4e-4d64-8c67-eb43c03187c2" containerName="oc" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.383373 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d6c94d-6b4e-4d64-8c67-eb43c03187c2" containerName="oc" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.385612 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.395171 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sjrw7"] Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.550439 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqv4r\" (UniqueName: \"kubernetes.io/projected/d4577ec6-c8bb-4b50-912b-59bedb35c38b-kube-api-access-lqv4r\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.551171 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-utilities\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.551303 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-catalog-content\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.653442 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqv4r\" (UniqueName: \"kubernetes.io/projected/d4577ec6-c8bb-4b50-912b-59bedb35c38b-kube-api-access-lqv4r\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.653968 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-utilities\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.654105 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-catalog-content\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.654550 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-utilities\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.654711 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-catalog-content\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.677649 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqv4r\" (UniqueName: \"kubernetes.io/projected/d4577ec6-c8bb-4b50-912b-59bedb35c38b-kube-api-access-lqv4r\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.716504 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:03 crc kubenswrapper[4781]: I0227 01:13:03.308485 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sjrw7"] Feb 27 01:13:03 crc kubenswrapper[4781]: I0227 01:13:03.847898 4781 generic.go:334] "Generic (PLEG): container finished" podID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerID="547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93" exitCode=0 Feb 27 01:13:03 crc kubenswrapper[4781]: I0227 01:13:03.847955 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjrw7" event={"ID":"d4577ec6-c8bb-4b50-912b-59bedb35c38b","Type":"ContainerDied","Data":"547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93"} Feb 27 01:13:03 crc kubenswrapper[4781]: I0227 01:13:03.848384 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjrw7" event={"ID":"d4577ec6-c8bb-4b50-912b-59bedb35c38b","Type":"ContainerStarted","Data":"7e0a68824449b3bff65757090cd8e9a85ee8a8d6a48a7613ac67a8bd344a423f"} Feb 27 01:13:04 crc kubenswrapper[4781]: I0227 01:13:04.859610 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjrw7" event={"ID":"d4577ec6-c8bb-4b50-912b-59bedb35c38b","Type":"ContainerStarted","Data":"ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9"} Feb 27 01:13:06 crc kubenswrapper[4781]: E0227 01:13:06.167039 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4577ec6_c8bb_4b50_912b_59bedb35c38b.slice/crio-conmon-ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9.scope\": RecentStats: unable to find data in memory cache]" Feb 27 01:13:06 crc kubenswrapper[4781]: I0227 01:13:06.877775 4781 generic.go:334] "Generic (PLEG): 
container finished" podID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerID="ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9" exitCode=0 Feb 27 01:13:06 crc kubenswrapper[4781]: I0227 01:13:06.877970 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjrw7" event={"ID":"d4577ec6-c8bb-4b50-912b-59bedb35c38b","Type":"ContainerDied","Data":"ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9"} Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.379458 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-574hs"] Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.382068 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.393542 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-574hs"] Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.556719 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9vnf\" (UniqueName: \"kubernetes.io/projected/57fe38df-608c-474e-b91f-4d744a0cb01f-kube-api-access-k9vnf\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.556892 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-utilities\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.557353 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-catalog-content\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.659515 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-catalog-content\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.659942 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9vnf\" (UniqueName: \"kubernetes.io/projected/57fe38df-608c-474e-b91f-4d744a0cb01f-kube-api-access-k9vnf\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.660137 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-utilities\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.660449 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-catalog-content\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.660756 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-utilities\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.804199 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9vnf\" (UniqueName: \"kubernetes.io/projected/57fe38df-608c-474e-b91f-4d744a0cb01f-kube-api-access-k9vnf\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:08 crc kubenswrapper[4781]: I0227 01:13:08.022408 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:08 crc kubenswrapper[4781]: I0227 01:13:08.585063 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-574hs"] Feb 27 01:13:08 crc kubenswrapper[4781]: I0227 01:13:08.912286 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-574hs" event={"ID":"57fe38df-608c-474e-b91f-4d744a0cb01f","Type":"ContainerStarted","Data":"75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274"} Feb 27 01:13:08 crc kubenswrapper[4781]: I0227 01:13:08.912333 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-574hs" event={"ID":"57fe38df-608c-474e-b91f-4d744a0cb01f","Type":"ContainerStarted","Data":"9177c2a726ddba9b8c88cf36289009378b4b28e22fc492abae24f8038b4d1db8"} Feb 27 01:13:08 crc kubenswrapper[4781]: I0227 01:13:08.915455 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjrw7" event={"ID":"d4577ec6-c8bb-4b50-912b-59bedb35c38b","Type":"ContainerStarted","Data":"80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea"} Feb 27 01:13:08 crc 
kubenswrapper[4781]: I0227 01:13:08.964406 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sjrw7" podStartSLOduration=3.489627225 podStartE2EDuration="6.964381578s" podCreationTimestamp="2026-02-27 01:13:02 +0000 UTC" firstStartedPulling="2026-02-27 01:13:03.850219199 +0000 UTC m=+4053.107758763" lastFinishedPulling="2026-02-27 01:13:07.324973562 +0000 UTC m=+4056.582513116" observedRunningTime="2026-02-27 01:13:08.955321955 +0000 UTC m=+4058.212861529" watchObservedRunningTime="2026-02-27 01:13:08.964381578 +0000 UTC m=+4058.221921132" Feb 27 01:13:09 crc kubenswrapper[4781]: I0227 01:13:09.929273 4781 generic.go:334] "Generic (PLEG): container finished" podID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerID="75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274" exitCode=0 Feb 27 01:13:09 crc kubenswrapper[4781]: I0227 01:13:09.929341 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-574hs" event={"ID":"57fe38df-608c-474e-b91f-4d744a0cb01f","Type":"ContainerDied","Data":"75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274"} Feb 27 01:13:10 crc kubenswrapper[4781]: I0227 01:13:10.941513 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-574hs" event={"ID":"57fe38df-608c-474e-b91f-4d744a0cb01f","Type":"ContainerStarted","Data":"db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5"} Feb 27 01:13:12 crc kubenswrapper[4781]: I0227 01:13:12.717773 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:12 crc kubenswrapper[4781]: I0227 01:13:12.718140 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:12 crc kubenswrapper[4781]: I0227 01:13:12.768985 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:13 crc kubenswrapper[4781]: I0227 01:13:13.971816 4781 generic.go:334] "Generic (PLEG): container finished" podID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerID="db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5" exitCode=0 Feb 27 01:13:13 crc kubenswrapper[4781]: I0227 01:13:13.971897 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-574hs" event={"ID":"57fe38df-608c-474e-b91f-4d744a0cb01f","Type":"ContainerDied","Data":"db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5"} Feb 27 01:13:14 crc kubenswrapper[4781]: I0227 01:13:14.309495 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:13:14 crc kubenswrapper[4781]: E0227 01:13:14.309970 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:13:14 crc kubenswrapper[4781]: I0227 01:13:14.984048 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-574hs" event={"ID":"57fe38df-608c-474e-b91f-4d744a0cb01f","Type":"ContainerStarted","Data":"2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e"} Feb 27 01:13:15 crc kubenswrapper[4781]: I0227 01:13:15.006437 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-574hs" podStartSLOduration=3.49755621 podStartE2EDuration="8.006420502s" podCreationTimestamp="2026-02-27 01:13:07 +0000 UTC" 
firstStartedPulling="2026-02-27 01:13:09.931434143 +0000 UTC m=+4059.188973697" lastFinishedPulling="2026-02-27 01:13:14.440298435 +0000 UTC m=+4063.697837989" observedRunningTime="2026-02-27 01:13:15.001257924 +0000 UTC m=+4064.258797488" watchObservedRunningTime="2026-02-27 01:13:15.006420502 +0000 UTC m=+4064.263960056" Feb 27 01:13:18 crc kubenswrapper[4781]: I0227 01:13:18.023606 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:18 crc kubenswrapper[4781]: I0227 01:13:18.024278 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:18 crc kubenswrapper[4781]: I0227 01:13:18.124803 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:19 crc kubenswrapper[4781]: I0227 01:13:19.071167 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:19 crc kubenswrapper[4781]: I0227 01:13:19.120412 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-574hs"] Feb 27 01:13:21 crc kubenswrapper[4781]: I0227 01:13:21.035911 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-574hs" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerName="registry-server" containerID="cri-o://2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e" gracePeriod=2 Feb 27 01:13:21 crc kubenswrapper[4781]: I0227 01:13:21.824413 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:21 crc kubenswrapper[4781]: I0227 01:13:21.976136 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9vnf\" (UniqueName: \"kubernetes.io/projected/57fe38df-608c-474e-b91f-4d744a0cb01f-kube-api-access-k9vnf\") pod \"57fe38df-608c-474e-b91f-4d744a0cb01f\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " Feb 27 01:13:21 crc kubenswrapper[4781]: I0227 01:13:21.976189 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-utilities\") pod \"57fe38df-608c-474e-b91f-4d744a0cb01f\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " Feb 27 01:13:21 crc kubenswrapper[4781]: I0227 01:13:21.977098 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-utilities" (OuterVolumeSpecName: "utilities") pod "57fe38df-608c-474e-b91f-4d744a0cb01f" (UID: "57fe38df-608c-474e-b91f-4d744a0cb01f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:13:21 crc kubenswrapper[4781]: I0227 01:13:21.977359 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-catalog-content\") pod \"57fe38df-608c-474e-b91f-4d744a0cb01f\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " Feb 27 01:13:21 crc kubenswrapper[4781]: I0227 01:13:21.978571 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:13:21 crc kubenswrapper[4781]: I0227 01:13:21.981709 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57fe38df-608c-474e-b91f-4d744a0cb01f-kube-api-access-k9vnf" (OuterVolumeSpecName: "kube-api-access-k9vnf") pod "57fe38df-608c-474e-b91f-4d744a0cb01f" (UID: "57fe38df-608c-474e-b91f-4d744a0cb01f"). InnerVolumeSpecName "kube-api-access-k9vnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.031070 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57fe38df-608c-474e-b91f-4d744a0cb01f" (UID: "57fe38df-608c-474e-b91f-4d744a0cb01f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.048714 4781 generic.go:334] "Generic (PLEG): container finished" podID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerID="2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e" exitCode=0 Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.048766 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-574hs" event={"ID":"57fe38df-608c-474e-b91f-4d744a0cb01f","Type":"ContainerDied","Data":"2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e"} Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.048800 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-574hs" event={"ID":"57fe38df-608c-474e-b91f-4d744a0cb01f","Type":"ContainerDied","Data":"9177c2a726ddba9b8c88cf36289009378b4b28e22fc492abae24f8038b4d1db8"} Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.048816 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.048823 4781 scope.go:117] "RemoveContainer" containerID="2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.080679 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.080721 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9vnf\" (UniqueName: \"kubernetes.io/projected/57fe38df-608c-474e-b91f-4d744a0cb01f-kube-api-access-k9vnf\") on node \"crc\" DevicePath \"\"" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.082028 4781 scope.go:117] "RemoveContainer" containerID="db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.090597 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-574hs"] Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.102029 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-574hs"] Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.114355 4781 scope.go:117] "RemoveContainer" containerID="75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.164446 4781 scope.go:117] "RemoveContainer" containerID="2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e" Feb 27 01:13:22 crc kubenswrapper[4781]: E0227 01:13:22.166230 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e\": container with ID starting with 
2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e not found: ID does not exist" containerID="2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.166273 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e"} err="failed to get container status \"2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e\": rpc error: code = NotFound desc = could not find container \"2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e\": container with ID starting with 2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e not found: ID does not exist" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.166302 4781 scope.go:117] "RemoveContainer" containerID="db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5" Feb 27 01:13:22 crc kubenswrapper[4781]: E0227 01:13:22.166725 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5\": container with ID starting with db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5 not found: ID does not exist" containerID="db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.166834 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5"} err="failed to get container status \"db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5\": rpc error: code = NotFound desc = could not find container \"db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5\": container with ID starting with db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5 not found: ID does not 
exist" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.166907 4781 scope.go:117] "RemoveContainer" containerID="75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274" Feb 27 01:13:22 crc kubenswrapper[4781]: E0227 01:13:22.167520 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274\": container with ID starting with 75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274 not found: ID does not exist" containerID="75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.167546 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274"} err="failed to get container status \"75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274\": rpc error: code = NotFound desc = could not find container \"75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274\": container with ID starting with 75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274 not found: ID does not exist" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.764305 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:23 crc kubenswrapper[4781]: I0227 01:13:23.325672 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" path="/var/lib/kubelet/pods/57fe38df-608c-474e-b91f-4d744a0cb01f/volumes" Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.063195 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sjrw7"] Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.063470 4781 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-sjrw7" podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerName="registry-server" containerID="cri-o://80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea" gracePeriod=2 Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.779105 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.940437 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-catalog-content\") pod \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.940521 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-utilities\") pod \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.940728 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqv4r\" (UniqueName: \"kubernetes.io/projected/d4577ec6-c8bb-4b50-912b-59bedb35c38b-kube-api-access-lqv4r\") pod \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.941405 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-utilities" (OuterVolumeSpecName: "utilities") pod "d4577ec6-c8bb-4b50-912b-59bedb35c38b" (UID: "d4577ec6-c8bb-4b50-912b-59bedb35c38b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.947272 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4577ec6-c8bb-4b50-912b-59bedb35c38b-kube-api-access-lqv4r" (OuterVolumeSpecName: "kube-api-access-lqv4r") pod "d4577ec6-c8bb-4b50-912b-59bedb35c38b" (UID: "d4577ec6-c8bb-4b50-912b-59bedb35c38b"). InnerVolumeSpecName "kube-api-access-lqv4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.996950 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4577ec6-c8bb-4b50-912b-59bedb35c38b" (UID: "d4577ec6-c8bb-4b50-912b-59bedb35c38b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.047350 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.047408 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.047429 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqv4r\" (UniqueName: \"kubernetes.io/projected/d4577ec6-c8bb-4b50-912b-59bedb35c38b-kube-api-access-lqv4r\") on node \"crc\" DevicePath \"\"" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.083180 4781 generic.go:334] "Generic (PLEG): container finished" podID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" 
containerID="80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea" exitCode=0 Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.083235 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjrw7" event={"ID":"d4577ec6-c8bb-4b50-912b-59bedb35c38b","Type":"ContainerDied","Data":"80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea"} Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.083266 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjrw7" event={"ID":"d4577ec6-c8bb-4b50-912b-59bedb35c38b","Type":"ContainerDied","Data":"7e0a68824449b3bff65757090cd8e9a85ee8a8d6a48a7613ac67a8bd344a423f"} Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.083296 4781 scope.go:117] "RemoveContainer" containerID="80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.083469 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.114101 4781 scope.go:117] "RemoveContainer" containerID="ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.137691 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sjrw7"] Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.142002 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sjrw7"] Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.158238 4781 scope.go:117] "RemoveContainer" containerID="547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.197640 4781 scope.go:117] "RemoveContainer" containerID="80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea" Feb 27 01:13:25 crc kubenswrapper[4781]: E0227 01:13:25.198193 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea\": container with ID starting with 80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea not found: ID does not exist" containerID="80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.198236 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea"} err="failed to get container status \"80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea\": rpc error: code = NotFound desc = could not find container \"80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea\": container with ID starting with 80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea not 
found: ID does not exist" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.198264 4781 scope.go:117] "RemoveContainer" containerID="ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9" Feb 27 01:13:25 crc kubenswrapper[4781]: E0227 01:13:25.198573 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9\": container with ID starting with ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9 not found: ID does not exist" containerID="ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.198600 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9"} err="failed to get container status \"ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9\": rpc error: code = NotFound desc = could not find container \"ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9\": container with ID starting with ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9 not found: ID does not exist" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.198614 4781 scope.go:117] "RemoveContainer" containerID="547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93" Feb 27 01:13:25 crc kubenswrapper[4781]: E0227 01:13:25.199009 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93\": container with ID starting with 547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93 not found: ID does not exist" containerID="547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.199043 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93"} err="failed to get container status \"547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93\": rpc error: code = NotFound desc = could not find container \"547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93\": container with ID starting with 547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93 not found: ID does not exist" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.320333 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" path="/var/lib/kubelet/pods/d4577ec6-c8bb-4b50-912b-59bedb35c38b/volumes" Feb 27 01:13:28 crc kubenswrapper[4781]: I0227 01:13:28.309216 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:13:28 crc kubenswrapper[4781]: E0227 01:13:28.310065 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:13:41 crc kubenswrapper[4781]: I0227 01:13:41.316848 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:13:41 crc kubenswrapper[4781]: E0227 01:13:41.317525 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:13:52 crc kubenswrapper[4781]: I0227 01:13:52.310117 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:13:52 crc kubenswrapper[4781]: E0227 01:13:52.311175 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.177531 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535914-n4hsg"] Feb 27 01:14:00 crc kubenswrapper[4781]: E0227 01:14:00.178337 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerName="registry-server" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.178354 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerName="registry-server" Feb 27 01:14:00 crc kubenswrapper[4781]: E0227 01:14:00.178386 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerName="registry-server" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.178394 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerName="registry-server" Feb 27 01:14:00 crc kubenswrapper[4781]: E0227 01:14:00.178410 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerName="extract-utilities" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.178418 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerName="extract-utilities" Feb 27 01:14:00 crc kubenswrapper[4781]: E0227 01:14:00.178433 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerName="extract-content" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.178439 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerName="extract-content" Feb 27 01:14:00 crc kubenswrapper[4781]: E0227 01:14:00.178454 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerName="extract-utilities" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.178461 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerName="extract-utilities" Feb 27 01:14:00 crc kubenswrapper[4781]: E0227 01:14:00.178476 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerName="extract-content" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.178483 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerName="extract-content" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.178716 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerName="registry-server" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.178735 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerName="registry-server" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.180778 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535914-n4hsg" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.187425 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.187542 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.188519 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.207310 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535914-n4hsg"] Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.279102 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzhz7\" (UniqueName: \"kubernetes.io/projected/2afc6c2c-4602-4819-bb62-46008ced90dc-kube-api-access-zzhz7\") pod \"auto-csr-approver-29535914-n4hsg\" (UID: \"2afc6c2c-4602-4819-bb62-46008ced90dc\") " pod="openshift-infra/auto-csr-approver-29535914-n4hsg" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.381762 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzhz7\" (UniqueName: \"kubernetes.io/projected/2afc6c2c-4602-4819-bb62-46008ced90dc-kube-api-access-zzhz7\") pod \"auto-csr-approver-29535914-n4hsg\" (UID: \"2afc6c2c-4602-4819-bb62-46008ced90dc\") " pod="openshift-infra/auto-csr-approver-29535914-n4hsg" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.402463 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzhz7\" (UniqueName: \"kubernetes.io/projected/2afc6c2c-4602-4819-bb62-46008ced90dc-kube-api-access-zzhz7\") pod \"auto-csr-approver-29535914-n4hsg\" (UID: \"2afc6c2c-4602-4819-bb62-46008ced90dc\") " 
pod="openshift-infra/auto-csr-approver-29535914-n4hsg" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.511108 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535914-n4hsg" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.968652 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535914-n4hsg"] Feb 27 01:14:01 crc kubenswrapper[4781]: I0227 01:14:01.446413 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535914-n4hsg" event={"ID":"2afc6c2c-4602-4819-bb62-46008ced90dc","Type":"ContainerStarted","Data":"75a382a9d1c98e2e384eb6e738f7fc24a346a404652db42c44f0e5096954bfa9"} Feb 27 01:14:03 crc kubenswrapper[4781]: I0227 01:14:03.474171 4781 generic.go:334] "Generic (PLEG): container finished" podID="2afc6c2c-4602-4819-bb62-46008ced90dc" containerID="b2b6fac5723bb6bb5cfc84762685d87a6769151aad24d4f3926a5af565d7efe8" exitCode=0 Feb 27 01:14:03 crc kubenswrapper[4781]: I0227 01:14:03.474418 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535914-n4hsg" event={"ID":"2afc6c2c-4602-4819-bb62-46008ced90dc","Type":"ContainerDied","Data":"b2b6fac5723bb6bb5cfc84762685d87a6769151aad24d4f3926a5af565d7efe8"} Feb 27 01:14:04 crc kubenswrapper[4781]: I0227 01:14:04.310291 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:14:04 crc kubenswrapper[4781]: E0227 01:14:04.311035 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" 
Feb 27 01:14:05 crc kubenswrapper[4781]: I0227 01:14:05.101486 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535914-n4hsg" Feb 27 01:14:05 crc kubenswrapper[4781]: I0227 01:14:05.184849 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzhz7\" (UniqueName: \"kubernetes.io/projected/2afc6c2c-4602-4819-bb62-46008ced90dc-kube-api-access-zzhz7\") pod \"2afc6c2c-4602-4819-bb62-46008ced90dc\" (UID: \"2afc6c2c-4602-4819-bb62-46008ced90dc\") " Feb 27 01:14:05 crc kubenswrapper[4781]: I0227 01:14:05.191715 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2afc6c2c-4602-4819-bb62-46008ced90dc-kube-api-access-zzhz7" (OuterVolumeSpecName: "kube-api-access-zzhz7") pod "2afc6c2c-4602-4819-bb62-46008ced90dc" (UID: "2afc6c2c-4602-4819-bb62-46008ced90dc"). InnerVolumeSpecName "kube-api-access-zzhz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:14:05 crc kubenswrapper[4781]: I0227 01:14:05.287482 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzhz7\" (UniqueName: \"kubernetes.io/projected/2afc6c2c-4602-4819-bb62-46008ced90dc-kube-api-access-zzhz7\") on node \"crc\" DevicePath \"\"" Feb 27 01:14:05 crc kubenswrapper[4781]: I0227 01:14:05.496172 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535914-n4hsg" event={"ID":"2afc6c2c-4602-4819-bb62-46008ced90dc","Type":"ContainerDied","Data":"75a382a9d1c98e2e384eb6e738f7fc24a346a404652db42c44f0e5096954bfa9"} Feb 27 01:14:05 crc kubenswrapper[4781]: I0227 01:14:05.496215 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75a382a9d1c98e2e384eb6e738f7fc24a346a404652db42c44f0e5096954bfa9" Feb 27 01:14:05 crc kubenswrapper[4781]: I0227 01:14:05.496530 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535914-n4hsg" Feb 27 01:14:06 crc kubenswrapper[4781]: I0227 01:14:06.172151 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535908-lzshf"] Feb 27 01:14:06 crc kubenswrapper[4781]: I0227 01:14:06.183229 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535908-lzshf"] Feb 27 01:14:07 crc kubenswrapper[4781]: I0227 01:14:07.319900 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6cd4500-04f9-471d-8c27-2ce1b03fa4f0" path="/var/lib/kubelet/pods/f6cd4500-04f9-471d-8c27-2ce1b03fa4f0/volumes" Feb 27 01:14:16 crc kubenswrapper[4781]: I0227 01:14:16.309516 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:14:16 crc kubenswrapper[4781]: E0227 01:14:16.310344 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:14:22 crc kubenswrapper[4781]: I0227 01:14:22.673766 4781 generic.go:334] "Generic (PLEG): container finished" podID="2cc23bf5-7773-4d33-b2be-2ee2a807f086" containerID="4ec0cfbe0f662afc3fb53d5da9b369a462851688dbd0c754fa273d9d0f52d0e5" exitCode=0 Feb 27 01:14:22 crc kubenswrapper[4781]: I0227 01:14:22.673867 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2cc23bf5-7773-4d33-b2be-2ee2a807f086","Type":"ContainerDied","Data":"4ec0cfbe0f662afc3fb53d5da9b369a462851688dbd0c754fa273d9d0f52d0e5"} Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.192081 4781 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.294663 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.294749 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj679\" (UniqueName: \"kubernetes.io/projected/2cc23bf5-7773-4d33-b2be-2ee2a807f086-kube-api-access-dj679\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.294912 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-config-data\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.294998 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.295038 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ssh-key\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.295064 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ca-certs\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.295114 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-workdir\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.295195 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-temporary\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.295248 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config-secret\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.295946 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-config-data" (OuterVolumeSpecName: "config-data") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.296320 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.302378 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cc23bf5-7773-4d33-b2be-2ee2a807f086-kube-api-access-dj679" (OuterVolumeSpecName: "kube-api-access-dj679") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "kube-api-access-dj679". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.302857 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "test-operator-logs") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.329729 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.333779 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.339580 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.356610 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.398731 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.398779 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.398792 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ssh-key\") on node \"crc\" DevicePath \"\""
Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.398806 4781 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ca-certs\") on node \"crc\" DevicePath \"\""
Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.398852 4781 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.398865 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.398878 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config\") on node \"crc\" DevicePath \"\""
Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.398890 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj679\" (UniqueName: \"kubernetes.io/projected/2cc23bf5-7773-4d33-b2be-2ee2a807f086-kube-api-access-dj679\") on node \"crc\" DevicePath \"\""
Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.426203 4781 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.502515 4781 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.695804 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2cc23bf5-7773-4d33-b2be-2ee2a807f086","Type":"ContainerDied","Data":"09459f242ec2925373f69aa651b16dcc96301f46d456e4eb0a8a401a4473bde9"}
Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.695857 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09459f242ec2925373f69aa651b16dcc96301f46d456e4eb0a8a401a4473bde9"
Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.695928 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.713366 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.807645 4781 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Feb 27 01:14:30 crc kubenswrapper[4781]: I0227 01:14:30.309425 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5"
Feb 27 01:14:30 crc kubenswrapper[4781]: E0227 01:14:30.310336 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 01:14:33 crc kubenswrapper[4781]: I0227 01:14:33.205652 4781 scope.go:117] "RemoveContainer" containerID="d86455502c8fe2209abff00ea2ac33cb262fd9455f065e8361be2d1baaf2ea79"
Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.752323 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Feb 27 01:14:36 crc kubenswrapper[4781]: E0227 01:14:36.753436 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc23bf5-7773-4d33-b2be-2ee2a807f086" containerName="tempest-tests-tempest-tests-runner"
Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.753450 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc23bf5-7773-4d33-b2be-2ee2a807f086" containerName="tempest-tests-tempest-tests-runner"
Feb 27 01:14:36 crc kubenswrapper[4781]: E0227 01:14:36.753477 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2afc6c2c-4602-4819-bb62-46008ced90dc" containerName="oc"
Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.753483 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2afc6c2c-4602-4819-bb62-46008ced90dc" containerName="oc"
Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.753684 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2afc6c2c-4602-4819-bb62-46008ced90dc" containerName="oc"
Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.753706 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cc23bf5-7773-4d33-b2be-2ee2a807f086" containerName="tempest-tests-tempest-tests-runner"
Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.754450 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.757471 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5s299"
Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.767608 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.786976 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"083b0010-19f4-4944-a097-96d20dad7eda\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.787082 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h82gl\" (UniqueName: \"kubernetes.io/projected/083b0010-19f4-4944-a097-96d20dad7eda-kube-api-access-h82gl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"083b0010-19f4-4944-a097-96d20dad7eda\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.888838 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"083b0010-19f4-4944-a097-96d20dad7eda\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.888984 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h82gl\" (UniqueName: \"kubernetes.io/projected/083b0010-19f4-4944-a097-96d20dad7eda-kube-api-access-h82gl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"083b0010-19f4-4944-a097-96d20dad7eda\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.889438 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"083b0010-19f4-4944-a097-96d20dad7eda\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.909487 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h82gl\" (UniqueName: \"kubernetes.io/projected/083b0010-19f4-4944-a097-96d20dad7eda-kube-api-access-h82gl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"083b0010-19f4-4944-a097-96d20dad7eda\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.917568 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"083b0010-19f4-4944-a097-96d20dad7eda\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 27 01:14:37 crc kubenswrapper[4781]: I0227 01:14:37.079330 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 27 01:14:37 crc kubenswrapper[4781]: I0227 01:14:37.525974 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Feb 27 01:14:37 crc kubenswrapper[4781]: I0227 01:14:37.819674 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"083b0010-19f4-4944-a097-96d20dad7eda","Type":"ContainerStarted","Data":"5e5618976697cfd1be3a0195ffb7529857496cb0f3f7c03ed2932f110e6b36be"}
Feb 27 01:14:41 crc kubenswrapper[4781]: I0227 01:14:41.856897 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"083b0010-19f4-4944-a097-96d20dad7eda","Type":"ContainerStarted","Data":"e12987251b06d55d729c13f02f8757c7e587543cbdff707df8570d08de533609"}
Feb 27 01:14:41 crc kubenswrapper[4781]: I0227 01:14:41.875576 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.206598773 podStartE2EDuration="5.875549153s" podCreationTimestamp="2026-02-27 01:14:36 +0000 UTC" firstStartedPulling="2026-02-27 01:14:37.520258359 +0000 UTC m=+4146.777797933" lastFinishedPulling="2026-02-27 01:14:41.189208759 +0000 UTC m=+4150.446748313" observedRunningTime="2026-02-27 01:14:41.86757768 +0000 UTC m=+4151.125117234" watchObservedRunningTime="2026-02-27 01:14:41.875549153 +0000 UTC m=+4151.133088707"
Feb 27 01:14:44 crc kubenswrapper[4781]: I0227 01:14:44.309852 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5"
Feb 27 01:14:44 crc kubenswrapper[4781]: E0227 01:14:44.310124 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 01:14:55 crc kubenswrapper[4781]: I0227 01:14:55.309868 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5"
Feb 27 01:14:55 crc kubenswrapper[4781]: E0227 01:14:55.310726 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.164020 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"]
Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.165660 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"
Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.169910 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.169910 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.190915 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"]
Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.289322 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1f1363d-33b3-4396-b176-66c221518e82-secret-volume\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"
Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.289751 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4dhc\" (UniqueName: \"kubernetes.io/projected/d1f1363d-33b3-4396-b176-66c221518e82-kube-api-access-s4dhc\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"
Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.289851 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1f1363d-33b3-4396-b176-66c221518e82-config-volume\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"
Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.391733 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1f1363d-33b3-4396-b176-66c221518e82-config-volume\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"
Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.391898 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1f1363d-33b3-4396-b176-66c221518e82-secret-volume\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"
Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.392047 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4dhc\" (UniqueName: \"kubernetes.io/projected/d1f1363d-33b3-4396-b176-66c221518e82-kube-api-access-s4dhc\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"
Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.393457 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1f1363d-33b3-4396-b176-66c221518e82-config-volume\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"
Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.688152 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4dhc\" (UniqueName: \"kubernetes.io/projected/d1f1363d-33b3-4396-b176-66c221518e82-kube-api-access-s4dhc\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"
Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.691063 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1f1363d-33b3-4396-b176-66c221518e82-secret-volume\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"
Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.803008 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"
Feb 27 01:15:01 crc kubenswrapper[4781]: I0227 01:15:01.297038 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"]
Feb 27 01:15:02 crc kubenswrapper[4781]: I0227 01:15:02.069910 4781 generic.go:334] "Generic (PLEG): container finished" podID="d1f1363d-33b3-4396-b176-66c221518e82" containerID="45c4167eed5398090c993c0790061c50a6dfea1582f588f9a80a8d848992fe77" exitCode=0
Feb 27 01:15:02 crc kubenswrapper[4781]: I0227 01:15:02.070004 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" event={"ID":"d1f1363d-33b3-4396-b176-66c221518e82","Type":"ContainerDied","Data":"45c4167eed5398090c993c0790061c50a6dfea1582f588f9a80a8d848992fe77"}
Feb 27 01:15:02 crc kubenswrapper[4781]: I0227 01:15:02.070269 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" event={"ID":"d1f1363d-33b3-4396-b176-66c221518e82","Type":"ContainerStarted","Data":"399cdc1b5315fc83638e6cf2b70c66d1835b56844b28c229f41af4984e90d503"}
Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.645846 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"
Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.770358 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1f1363d-33b3-4396-b176-66c221518e82-config-volume\") pod \"d1f1363d-33b3-4396-b176-66c221518e82\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") "
Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.770537 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1f1363d-33b3-4396-b176-66c221518e82-secret-volume\") pod \"d1f1363d-33b3-4396-b176-66c221518e82\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") "
Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.770571 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4dhc\" (UniqueName: \"kubernetes.io/projected/d1f1363d-33b3-4396-b176-66c221518e82-kube-api-access-s4dhc\") pod \"d1f1363d-33b3-4396-b176-66c221518e82\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") "
Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.771127 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1f1363d-33b3-4396-b176-66c221518e82-config-volume" (OuterVolumeSpecName: "config-volume") pod "d1f1363d-33b3-4396-b176-66c221518e82" (UID: "d1f1363d-33b3-4396-b176-66c221518e82"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.776838 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1f1363d-33b3-4396-b176-66c221518e82-kube-api-access-s4dhc" (OuterVolumeSpecName: "kube-api-access-s4dhc") pod "d1f1363d-33b3-4396-b176-66c221518e82" (UID: "d1f1363d-33b3-4396-b176-66c221518e82"). InnerVolumeSpecName "kube-api-access-s4dhc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.776880 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f1363d-33b3-4396-b176-66c221518e82-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d1f1363d-33b3-4396-b176-66c221518e82" (UID: "d1f1363d-33b3-4396-b176-66c221518e82"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.872836 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1f1363d-33b3-4396-b176-66c221518e82-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.872887 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4dhc\" (UniqueName: \"kubernetes.io/projected/d1f1363d-33b3-4396-b176-66c221518e82-kube-api-access-s4dhc\") on node \"crc\" DevicePath \"\""
Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.872901 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1f1363d-33b3-4396-b176-66c221518e82-config-volume\") on node \"crc\" DevicePath \"\""
Feb 27 01:15:04 crc kubenswrapper[4781]: I0227 01:15:04.099193 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" event={"ID":"d1f1363d-33b3-4396-b176-66c221518e82","Type":"ContainerDied","Data":"399cdc1b5315fc83638e6cf2b70c66d1835b56844b28c229f41af4984e90d503"}
Feb 27 01:15:04 crc kubenswrapper[4781]: I0227 01:15:04.099235 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="399cdc1b5315fc83638e6cf2b70c66d1835b56844b28c229f41af4984e90d503"
Feb 27 01:15:04 crc kubenswrapper[4781]: I0227 01:15:04.099516 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"
Feb 27 01:15:04 crc kubenswrapper[4781]: I0227 01:15:04.745838 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"]
Feb 27 01:15:04 crc kubenswrapper[4781]: I0227 01:15:04.763311 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"]
Feb 27 01:15:05 crc kubenswrapper[4781]: I0227 01:15:05.320865 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb78ed91-75d4-40d9-9359-da1c3878e145" path="/var/lib/kubelet/pods/eb78ed91-75d4-40d9-9359-da1c3878e145/volumes"
Feb 27 01:15:08 crc kubenswrapper[4781]: I0227 01:15:08.311327 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5"
Feb 27 01:15:08 crc kubenswrapper[4781]: E0227 01:15:08.312118 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 01:15:22 crc kubenswrapper[4781]: I0227 01:15:22.309676 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5"
Feb 27 01:15:22 crc kubenswrapper[4781]: E0227 01:15:22.310569 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 01:15:33 crc kubenswrapper[4781]: I0227 01:15:33.310489 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5"
Feb 27 01:15:33 crc kubenswrapper[4781]: E0227 01:15:33.311644 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571"
Feb 27 01:15:33 crc kubenswrapper[4781]: I0227 01:15:33.318436 4781 scope.go:117] "RemoveContainer" containerID="d91a97b2a127dcb363e0a68bf8507e044d643d2c3b09f879675dfcd44d75afab"
Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.039766 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vvzsl/must-gather-b97zf"]
Feb 27 01:15:38 crc kubenswrapper[4781]: E0227 01:15:38.040891 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f1363d-33b3-4396-b176-66c221518e82" containerName="collect-profiles"
Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.040909 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f1363d-33b3-4396-b176-66c221518e82" containerName="collect-profiles"
Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.041153 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1f1363d-33b3-4396-b176-66c221518e82" containerName="collect-profiles"
Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.042465 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vvzsl/must-gather-b97zf"
Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.044488 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vvzsl"/"openshift-service-ca.crt"
Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.044541 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vvzsl"/"default-dockercfg-pccr7"
Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.060692 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vvzsl"/"kube-root-ca.crt"
Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.061508 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vvzsl/must-gather-b97zf"]
Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.119078 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/03276b70-f5f8-486f-beb1-070a017efc66-must-gather-output\") pod \"must-gather-b97zf\" (UID: \"03276b70-f5f8-486f-beb1-070a017efc66\") " pod="openshift-must-gather-vvzsl/must-gather-b97zf"
Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.119463 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6k9r\" (UniqueName: \"kubernetes.io/projected/03276b70-f5f8-486f-beb1-070a017efc66-kube-api-access-q6k9r\") pod \"must-gather-b97zf\" (UID: \"03276b70-f5f8-486f-beb1-070a017efc66\") " pod="openshift-must-gather-vvzsl/must-gather-b97zf"
Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.221929 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/03276b70-f5f8-486f-beb1-070a017efc66-must-gather-output\") pod \"must-gather-b97zf\" (UID: \"03276b70-f5f8-486f-beb1-070a017efc66\") " pod="openshift-must-gather-vvzsl/must-gather-b97zf"
Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.222015 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6k9r\" (UniqueName: \"kubernetes.io/projected/03276b70-f5f8-486f-beb1-070a017efc66-kube-api-access-q6k9r\") pod \"must-gather-b97zf\" (UID: \"03276b70-f5f8-486f-beb1-070a017efc66\") " pod="openshift-must-gather-vvzsl/must-gather-b97zf"
Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.222471 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/03276b70-f5f8-486f-beb1-070a017efc66-must-gather-output\") pod \"must-gather-b97zf\" (UID: \"03276b70-f5f8-486f-beb1-070a017efc66\") " pod="openshift-must-gather-vvzsl/must-gather-b97zf"
Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.247474 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6k9r\" (UniqueName: \"kubernetes.io/projected/03276b70-f5f8-486f-beb1-070a017efc66-kube-api-access-q6k9r\") pod \"must-gather-b97zf\" (UID: \"03276b70-f5f8-486f-beb1-070a017efc66\") " pod="openshift-must-gather-vvzsl/must-gather-b97zf"
Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.365493 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vvzsl/must-gather-b97zf"
Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.952688 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vvzsl/must-gather-b97zf"]
Feb 27 01:15:39 crc kubenswrapper[4781]: I0227 01:15:39.471465 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/must-gather-b97zf" event={"ID":"03276b70-f5f8-486f-beb1-070a017efc66","Type":"ContainerStarted","Data":"75213b32388c1fc11d660814864b8f23dbc7e620d603be8385e2ead2c1e70380"}
Feb 27 01:15:46 crc kubenswrapper[4781]: I0227 01:15:46.311260 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5"
Feb 27 01:15:47 crc kubenswrapper[4781]: I0227 01:15:47.564684 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"93857194fe96d9ea4ad88dce6987b56ca3a1bbc406106d6f82950d6a036e6c83"}
Feb 27 01:15:48 crc kubenswrapper[4781]: I0227 01:15:48.575923 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/must-gather-b97zf" event={"ID":"03276b70-f5f8-486f-beb1-070a017efc66","Type":"ContainerStarted","Data":"7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234"}
Feb 27 01:15:48 crc kubenswrapper[4781]: I0227 01:15:48.576267 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/must-gather-b97zf" event={"ID":"03276b70-f5f8-486f-beb1-070a017efc66","Type":"ContainerStarted","Data":"8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81"}
Feb 27 01:15:48 crc kubenswrapper[4781]: I0227 01:15:48.592948 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vvzsl/must-gather-b97zf" podStartSLOduration=2.197822789 podStartE2EDuration="10.592927406s" podCreationTimestamp="2026-02-27 01:15:38 +0000 UTC" firstStartedPulling="2026-02-27 01:15:38.959964289 +0000 UTC m=+4208.217503843" lastFinishedPulling="2026-02-27 01:15:47.355068906 +0000 UTC m=+4216.612608460" observedRunningTime="2026-02-27 01:15:48.590145442 +0000 UTC m=+4217.847684996" watchObservedRunningTime="2026-02-27 01:15:48.592927406 +0000 UTC m=+4217.850466960"
Feb 27 01:15:52 crc kubenswrapper[4781]: I0227 01:15:52.799668 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-ml4c2"]
Feb 27 01:15:52 crc kubenswrapper[4781]: I0227 01:15:52.803140 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-ml4c2"
Feb 27 01:15:52 crc kubenswrapper[4781]: I0227 01:15:52.976125 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5c6w\" (UniqueName: \"kubernetes.io/projected/ec2ecbcb-d11e-4803-80d0-cda5c906849b-kube-api-access-c5c6w\") pod \"crc-debug-ml4c2\" (UID: \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\") " pod="openshift-must-gather-vvzsl/crc-debug-ml4c2"
Feb 27 01:15:52 crc kubenswrapper[4781]: I0227 01:15:52.976737 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec2ecbcb-d11e-4803-80d0-cda5c906849b-host\") pod \"crc-debug-ml4c2\" (UID: \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\") " pod="openshift-must-gather-vvzsl/crc-debug-ml4c2"
Feb 27 01:15:53 crc kubenswrapper[4781]: I0227 01:15:53.078913 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec2ecbcb-d11e-4803-80d0-cda5c906849b-host\") pod \"crc-debug-ml4c2\" (UID: \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\") " pod="openshift-must-gather-vvzsl/crc-debug-ml4c2"
Feb 27 01:15:53 crc kubenswrapper[4781]: I0227 01:15:53.079194 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec2ecbcb-d11e-4803-80d0-cda5c906849b-host\") pod \"crc-debug-ml4c2\" (UID: \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\") " pod="openshift-must-gather-vvzsl/crc-debug-ml4c2"
Feb 27 01:15:53 crc kubenswrapper[4781]: I0227 01:15:53.079777 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5c6w\" (UniqueName: \"kubernetes.io/projected/ec2ecbcb-d11e-4803-80d0-cda5c906849b-kube-api-access-c5c6w\") pod \"crc-debug-ml4c2\" (UID: \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\") " pod="openshift-must-gather-vvzsl/crc-debug-ml4c2"
Feb 27 01:15:53 crc kubenswrapper[4781]: I0227 01:15:53.478336 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5c6w\" (UniqueName: \"kubernetes.io/projected/ec2ecbcb-d11e-4803-80d0-cda5c906849b-kube-api-access-c5c6w\") pod \"crc-debug-ml4c2\" (UID: \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\") " pod="openshift-must-gather-vvzsl/crc-debug-ml4c2"
Feb 27 01:15:53 crc kubenswrapper[4781]: I0227 01:15:53.724570 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-ml4c2"
Feb 27 01:15:53 crc kubenswrapper[4781]: W0227 01:15:53.762155 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec2ecbcb_d11e_4803_80d0_cda5c906849b.slice/crio-3c19b7b6c89391495311645dc7fda7aa50a853b6c0af7a1674ece3c93ebb3511 WatchSource:0}: Error finding container 3c19b7b6c89391495311645dc7fda7aa50a853b6c0af7a1674ece3c93ebb3511: Status 404 returned error can't find the container with id 3c19b7b6c89391495311645dc7fda7aa50a853b6c0af7a1674ece3c93ebb3511
Feb 27 01:15:54 crc kubenswrapper[4781]: I0227 01:15:54.695888 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" event={"ID":"ec2ecbcb-d11e-4803-80d0-cda5c906849b","Type":"ContainerStarted","Data":"3c19b7b6c89391495311645dc7fda7aa50a853b6c0af7a1674ece3c93ebb3511"}
Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.151000 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535916-5rslt"]
Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.153228 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535916-5rslt"
Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.156139 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.156250 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr"
Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.156268 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.164454 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535916-5rslt"]
Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.249804 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rrph\" (UniqueName: \"kubernetes.io/projected/deca34b2-a27c-46b9-bbe3-ac2d08a7a72e-kube-api-access-2rrph\") pod \"auto-csr-approver-29535916-5rslt\" (UID: \"deca34b2-a27c-46b9-bbe3-ac2d08a7a72e\") " pod="openshift-infra/auto-csr-approver-29535916-5rslt"
Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.352558 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rrph\" (UniqueName: \"kubernetes.io/projected/deca34b2-a27c-46b9-bbe3-ac2d08a7a72e-kube-api-access-2rrph\") pod \"auto-csr-approver-29535916-5rslt\" (UID: \"deca34b2-a27c-46b9-bbe3-ac2d08a7a72e\") " pod="openshift-infra/auto-csr-approver-29535916-5rslt"
Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.374286 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rrph\" (UniqueName: \"kubernetes.io/projected/deca34b2-a27c-46b9-bbe3-ac2d08a7a72e-kube-api-access-2rrph\") pod \"auto-csr-approver-29535916-5rslt\" (UID: \"deca34b2-a27c-46b9-bbe3-ac2d08a7a72e\") " pod="openshift-infra/auto-csr-approver-29535916-5rslt"
Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.481157 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535916-5rslt"
Feb 27 01:16:06 crc kubenswrapper[4781]: I0227 01:16:06.884234 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" event={"ID":"ec2ecbcb-d11e-4803-80d0-cda5c906849b","Type":"ContainerStarted","Data":"5429009dce4ed7561680c8a6236f2fd38e0d3ba334a4b82f95acb92d3f8dce94"}
Feb 27 01:16:06 crc kubenswrapper[4781]: I0227 01:16:06.907454 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" podStartSLOduration=2.377190795 podStartE2EDuration="14.907433524s" podCreationTimestamp="2026-02-27 01:15:52 +0000 UTC" firstStartedPulling="2026-02-27 01:15:53.765280622 +0000 UTC m=+4223.022820176" lastFinishedPulling="2026-02-27 01:16:06.295523351 +0000 UTC m=+4235.553062905" observedRunningTime="2026-02-27 01:16:06.898559356 +0000 UTC m=+4236.156098910" watchObservedRunningTime="2026-02-27 01:16:06.907433524 +0000 UTC m=+4236.164973078"
Feb 27 01:16:06 crc kubenswrapper[4781]: I0227 01:16:06.946603 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535916-5rslt"]
Feb 27 01:16:06 crc kubenswrapper[4781]: W0227 01:16:06.947072 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeca34b2_a27c_46b9_bbe3_ac2d08a7a72e.slice/crio-182ecfdc1953621dfcf56b119ca2c33c14ebd6a3b3896b349c359878020790a1 WatchSource:0}: Error finding container 182ecfdc1953621dfcf56b119ca2c33c14ebd6a3b3896b349c359878020790a1: Status 404 returned error can't find the container with id 182ecfdc1953621dfcf56b119ca2c33c14ebd6a3b3896b349c359878020790a1
Feb 27 01:16:07 crc kubenswrapper[4781]: I0227 01:16:07.898459 4781
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535916-5rslt" event={"ID":"deca34b2-a27c-46b9-bbe3-ac2d08a7a72e","Type":"ContainerStarted","Data":"182ecfdc1953621dfcf56b119ca2c33c14ebd6a3b3896b349c359878020790a1"} Feb 27 01:16:08 crc kubenswrapper[4781]: I0227 01:16:08.910139 4781 generic.go:334] "Generic (PLEG): container finished" podID="deca34b2-a27c-46b9-bbe3-ac2d08a7a72e" containerID="a89d93284b5be38596ce103c331565c9dbf5be828da69afb3c56f041c046abb6" exitCode=0 Feb 27 01:16:08 crc kubenswrapper[4781]: I0227 01:16:08.910245 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535916-5rslt" event={"ID":"deca34b2-a27c-46b9-bbe3-ac2d08a7a72e","Type":"ContainerDied","Data":"a89d93284b5be38596ce103c331565c9dbf5be828da69afb3c56f041c046abb6"} Feb 27 01:16:10 crc kubenswrapper[4781]: I0227 01:16:10.579204 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535916-5rslt" Feb 27 01:16:10 crc kubenswrapper[4781]: I0227 01:16:10.696303 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rrph\" (UniqueName: \"kubernetes.io/projected/deca34b2-a27c-46b9-bbe3-ac2d08a7a72e-kube-api-access-2rrph\") pod \"deca34b2-a27c-46b9-bbe3-ac2d08a7a72e\" (UID: \"deca34b2-a27c-46b9-bbe3-ac2d08a7a72e\") " Feb 27 01:16:10 crc kubenswrapper[4781]: I0227 01:16:10.703303 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deca34b2-a27c-46b9-bbe3-ac2d08a7a72e-kube-api-access-2rrph" (OuterVolumeSpecName: "kube-api-access-2rrph") pod "deca34b2-a27c-46b9-bbe3-ac2d08a7a72e" (UID: "deca34b2-a27c-46b9-bbe3-ac2d08a7a72e"). InnerVolumeSpecName "kube-api-access-2rrph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:16:10 crc kubenswrapper[4781]: I0227 01:16:10.799425 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rrph\" (UniqueName: \"kubernetes.io/projected/deca34b2-a27c-46b9-bbe3-ac2d08a7a72e-kube-api-access-2rrph\") on node \"crc\" DevicePath \"\"" Feb 27 01:16:10 crc kubenswrapper[4781]: I0227 01:16:10.935327 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535916-5rslt" event={"ID":"deca34b2-a27c-46b9-bbe3-ac2d08a7a72e","Type":"ContainerDied","Data":"182ecfdc1953621dfcf56b119ca2c33c14ebd6a3b3896b349c359878020790a1"} Feb 27 01:16:10 crc kubenswrapper[4781]: I0227 01:16:10.935708 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="182ecfdc1953621dfcf56b119ca2c33c14ebd6a3b3896b349c359878020790a1" Feb 27 01:16:10 crc kubenswrapper[4781]: I0227 01:16:10.935384 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535916-5rslt" Feb 27 01:16:11 crc kubenswrapper[4781]: I0227 01:16:11.665241 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535910-zxgrs"] Feb 27 01:16:11 crc kubenswrapper[4781]: I0227 01:16:11.678441 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535910-zxgrs"] Feb 27 01:16:13 crc kubenswrapper[4781]: I0227 01:16:13.324976 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5688def-e560-413e-8be5-1d2cfd7e7b4b" path="/var/lib/kubelet/pods/b5688def-e560-413e-8be5-1d2cfd7e7b4b/volumes" Feb 27 01:16:33 crc kubenswrapper[4781]: I0227 01:16:33.411475 4781 scope.go:117] "RemoveContainer" containerID="2f2421387f96858e89c61569a502259afb51c7ee81cb327e3f4310b20461360e" Feb 27 01:17:04 crc kubenswrapper[4781]: I0227 01:17:04.471015 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="ec2ecbcb-d11e-4803-80d0-cda5c906849b" containerID="5429009dce4ed7561680c8a6236f2fd38e0d3ba334a4b82f95acb92d3f8dce94" exitCode=0 Feb 27 01:17:04 crc kubenswrapper[4781]: I0227 01:17:04.471105 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" event={"ID":"ec2ecbcb-d11e-4803-80d0-cda5c906849b","Type":"ContainerDied","Data":"5429009dce4ed7561680c8a6236f2fd38e0d3ba334a4b82f95acb92d3f8dce94"} Feb 27 01:17:05 crc kubenswrapper[4781]: I0227 01:17:05.605201 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" Feb 27 01:17:05 crc kubenswrapper[4781]: I0227 01:17:05.641653 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-ml4c2"] Feb 27 01:17:05 crc kubenswrapper[4781]: I0227 01:17:05.669845 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-ml4c2"] Feb 27 01:17:05 crc kubenswrapper[4781]: I0227 01:17:05.777664 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5c6w\" (UniqueName: \"kubernetes.io/projected/ec2ecbcb-d11e-4803-80d0-cda5c906849b-kube-api-access-c5c6w\") pod \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\" (UID: \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\") " Feb 27 01:17:05 crc kubenswrapper[4781]: I0227 01:17:05.778785 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec2ecbcb-d11e-4803-80d0-cda5c906849b-host\") pod \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\" (UID: \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\") " Feb 27 01:17:05 crc kubenswrapper[4781]: I0227 01:17:05.778933 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec2ecbcb-d11e-4803-80d0-cda5c906849b-host" (OuterVolumeSpecName: "host") pod "ec2ecbcb-d11e-4803-80d0-cda5c906849b" (UID: 
"ec2ecbcb-d11e-4803-80d0-cda5c906849b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:05 crc kubenswrapper[4781]: I0227 01:17:05.779551 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec2ecbcb-d11e-4803-80d0-cda5c906849b-host\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.475857 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec2ecbcb-d11e-4803-80d0-cda5c906849b-kube-api-access-c5c6w" (OuterVolumeSpecName: "kube-api-access-c5c6w") pod "ec2ecbcb-d11e-4803-80d0-cda5c906849b" (UID: "ec2ecbcb-d11e-4803-80d0-cda5c906849b"). InnerVolumeSpecName "kube-api-access-c5c6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.497142 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5c6w\" (UniqueName: \"kubernetes.io/projected/ec2ecbcb-d11e-4803-80d0-cda5c906849b-kube-api-access-c5c6w\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.502716 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c19b7b6c89391495311645dc7fda7aa50a853b6c0af7a1674ece3c93ebb3511" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.502835 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.807691 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-nbfnc"] Feb 27 01:17:06 crc kubenswrapper[4781]: E0227 01:17:06.808147 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deca34b2-a27c-46b9-bbe3-ac2d08a7a72e" containerName="oc" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.808166 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="deca34b2-a27c-46b9-bbe3-ac2d08a7a72e" containerName="oc" Feb 27 01:17:06 crc kubenswrapper[4781]: E0227 01:17:06.808191 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec2ecbcb-d11e-4803-80d0-cda5c906849b" containerName="container-00" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.808202 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec2ecbcb-d11e-4803-80d0-cda5c906849b" containerName="container-00" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.808388 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="deca34b2-a27c-46b9-bbe3-ac2d08a7a72e" containerName="oc" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.808403 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec2ecbcb-d11e-4803-80d0-cda5c906849b" containerName="container-00" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.809133 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.905686 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef5ae2b7-86af-4272-9ec7-767cfa31836a-host\") pod \"crc-debug-nbfnc\" (UID: \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\") " pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.905824 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zfk7\" (UniqueName: \"kubernetes.io/projected/ef5ae2b7-86af-4272-9ec7-767cfa31836a-kube-api-access-4zfk7\") pod \"crc-debug-nbfnc\" (UID: \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\") " pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:07 crc kubenswrapper[4781]: I0227 01:17:07.008684 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef5ae2b7-86af-4272-9ec7-767cfa31836a-host\") pod \"crc-debug-nbfnc\" (UID: \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\") " pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:07 crc kubenswrapper[4781]: I0227 01:17:07.008790 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zfk7\" (UniqueName: \"kubernetes.io/projected/ef5ae2b7-86af-4272-9ec7-767cfa31836a-kube-api-access-4zfk7\") pod \"crc-debug-nbfnc\" (UID: \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\") " pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:07 crc kubenswrapper[4781]: I0227 01:17:07.008878 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef5ae2b7-86af-4272-9ec7-767cfa31836a-host\") pod \"crc-debug-nbfnc\" (UID: \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\") " pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:07 crc 
kubenswrapper[4781]: I0227 01:17:07.030382 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zfk7\" (UniqueName: \"kubernetes.io/projected/ef5ae2b7-86af-4272-9ec7-767cfa31836a-kube-api-access-4zfk7\") pod \"crc-debug-nbfnc\" (UID: \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\") " pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:07 crc kubenswrapper[4781]: I0227 01:17:07.126926 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:07 crc kubenswrapper[4781]: I0227 01:17:07.321189 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec2ecbcb-d11e-4803-80d0-cda5c906849b" path="/var/lib/kubelet/pods/ec2ecbcb-d11e-4803-80d0-cda5c906849b/volumes" Feb 27 01:17:07 crc kubenswrapper[4781]: I0227 01:17:07.513604 4781 generic.go:334] "Generic (PLEG): container finished" podID="ef5ae2b7-86af-4272-9ec7-767cfa31836a" containerID="4d92a51020b7596d7680b0bc9dcf1180dddc52a240e9d1a8d518dcb39bbffd84" exitCode=0 Feb 27 01:17:07 crc kubenswrapper[4781]: I0227 01:17:07.513660 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" event={"ID":"ef5ae2b7-86af-4272-9ec7-767cfa31836a","Type":"ContainerDied","Data":"4d92a51020b7596d7680b0bc9dcf1180dddc52a240e9d1a8d518dcb39bbffd84"} Feb 27 01:17:07 crc kubenswrapper[4781]: I0227 01:17:07.513687 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" event={"ID":"ef5ae2b7-86af-4272-9ec7-767cfa31836a","Type":"ContainerStarted","Data":"72c5d34b8aad92c4bd90c23fcd563898e50ffb0bd14489ced455f6756ad964e7"} Feb 27 01:17:08 crc kubenswrapper[4781]: I0227 01:17:08.649930 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:08 crc kubenswrapper[4781]: I0227 01:17:08.737489 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zfk7\" (UniqueName: \"kubernetes.io/projected/ef5ae2b7-86af-4272-9ec7-767cfa31836a-kube-api-access-4zfk7\") pod \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\" (UID: \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\") " Feb 27 01:17:08 crc kubenswrapper[4781]: I0227 01:17:08.738053 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef5ae2b7-86af-4272-9ec7-767cfa31836a-host\") pod \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\" (UID: \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\") " Feb 27 01:17:08 crc kubenswrapper[4781]: I0227 01:17:08.738169 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef5ae2b7-86af-4272-9ec7-767cfa31836a-host" (OuterVolumeSpecName: "host") pod "ef5ae2b7-86af-4272-9ec7-767cfa31836a" (UID: "ef5ae2b7-86af-4272-9ec7-767cfa31836a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:08 crc kubenswrapper[4781]: I0227 01:17:08.738963 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef5ae2b7-86af-4272-9ec7-767cfa31836a-host\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:08 crc kubenswrapper[4781]: I0227 01:17:08.745610 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef5ae2b7-86af-4272-9ec7-767cfa31836a-kube-api-access-4zfk7" (OuterVolumeSpecName: "kube-api-access-4zfk7") pod "ef5ae2b7-86af-4272-9ec7-767cfa31836a" (UID: "ef5ae2b7-86af-4272-9ec7-767cfa31836a"). InnerVolumeSpecName "kube-api-access-4zfk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:17:08 crc kubenswrapper[4781]: I0227 01:17:08.840215 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zfk7\" (UniqueName: \"kubernetes.io/projected/ef5ae2b7-86af-4272-9ec7-767cfa31836a-kube-api-access-4zfk7\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:08 crc kubenswrapper[4781]: I0227 01:17:08.984429 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-nbfnc"] Feb 27 01:17:09 crc kubenswrapper[4781]: I0227 01:17:09.012026 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-nbfnc"] Feb 27 01:17:09 crc kubenswrapper[4781]: I0227 01:17:09.321772 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef5ae2b7-86af-4272-9ec7-767cfa31836a" path="/var/lib/kubelet/pods/ef5ae2b7-86af-4272-9ec7-767cfa31836a/volumes" Feb 27 01:17:09 crc kubenswrapper[4781]: I0227 01:17:09.534380 4781 scope.go:117] "RemoveContainer" containerID="4d92a51020b7596d7680b0bc9dcf1180dddc52a240e9d1a8d518dcb39bbffd84" Feb 27 01:17:09 crc kubenswrapper[4781]: I0227 01:17:09.534427 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.169999 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-8tfrx"] Feb 27 01:17:10 crc kubenswrapper[4781]: E0227 01:17:10.171196 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5ae2b7-86af-4272-9ec7-767cfa31836a" containerName="container-00" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.171214 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5ae2b7-86af-4272-9ec7-767cfa31836a" containerName="container-00" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.171510 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef5ae2b7-86af-4272-9ec7-767cfa31836a" containerName="container-00" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.172414 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.273220 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57gj7\" (UniqueName: \"kubernetes.io/projected/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-kube-api-access-57gj7\") pod \"crc-debug-8tfrx\" (UID: \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\") " pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.273308 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-host\") pod \"crc-debug-8tfrx\" (UID: \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\") " pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.375354 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57gj7\" (UniqueName: 
\"kubernetes.io/projected/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-kube-api-access-57gj7\") pod \"crc-debug-8tfrx\" (UID: \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\") " pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.375437 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-host\") pod \"crc-debug-8tfrx\" (UID: \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\") " pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.375681 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-host\") pod \"crc-debug-8tfrx\" (UID: \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\") " pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.397774 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57gj7\" (UniqueName: \"kubernetes.io/projected/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-kube-api-access-57gj7\") pod \"crc-debug-8tfrx\" (UID: \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\") " pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.494059 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:10 crc kubenswrapper[4781]: W0227 01:17:10.543900 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba840bdf_362a_4cad_85e5_3f450bd7f2f5.slice/crio-1506348ce5416450a0d38efaeef159358e8fab0ed16fcde55a882c178f0e269a WatchSource:0}: Error finding container 1506348ce5416450a0d38efaeef159358e8fab0ed16fcde55a882c178f0e269a: Status 404 returned error can't find the container with id 1506348ce5416450a0d38efaeef159358e8fab0ed16fcde55a882c178f0e269a Feb 27 01:17:11 crc kubenswrapper[4781]: I0227 01:17:11.565442 4781 generic.go:334] "Generic (PLEG): container finished" podID="ba840bdf-362a-4cad-85e5-3f450bd7f2f5" containerID="aa2a575779b6b05c095bd940a68ea17e213f83823d4e072ee160e36b9bfd3fea" exitCode=0 Feb 27 01:17:11 crc kubenswrapper[4781]: I0227 01:17:11.566082 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" event={"ID":"ba840bdf-362a-4cad-85e5-3f450bd7f2f5","Type":"ContainerDied","Data":"aa2a575779b6b05c095bd940a68ea17e213f83823d4e072ee160e36b9bfd3fea"} Feb 27 01:17:11 crc kubenswrapper[4781]: I0227 01:17:11.566139 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" event={"ID":"ba840bdf-362a-4cad-85e5-3f450bd7f2f5","Type":"ContainerStarted","Data":"1506348ce5416450a0d38efaeef159358e8fab0ed16fcde55a882c178f0e269a"} Feb 27 01:17:11 crc kubenswrapper[4781]: I0227 01:17:11.612481 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-8tfrx"] Feb 27 01:17:11 crc kubenswrapper[4781]: I0227 01:17:11.622479 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-8tfrx"] Feb 27 01:17:12 crc kubenswrapper[4781]: I0227 01:17:12.687558 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:12 crc kubenswrapper[4781]: I0227 01:17:12.826845 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-host\") pod \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\" (UID: \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\") " Feb 27 01:17:12 crc kubenswrapper[4781]: I0227 01:17:12.826944 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57gj7\" (UniqueName: \"kubernetes.io/projected/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-kube-api-access-57gj7\") pod \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\" (UID: \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\") " Feb 27 01:17:12 crc kubenswrapper[4781]: I0227 01:17:12.826956 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-host" (OuterVolumeSpecName: "host") pod "ba840bdf-362a-4cad-85e5-3f450bd7f2f5" (UID: "ba840bdf-362a-4cad-85e5-3f450bd7f2f5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:12 crc kubenswrapper[4781]: I0227 01:17:12.827521 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-host\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:12 crc kubenswrapper[4781]: I0227 01:17:12.843868 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-kube-api-access-57gj7" (OuterVolumeSpecName: "kube-api-access-57gj7") pod "ba840bdf-362a-4cad-85e5-3f450bd7f2f5" (UID: "ba840bdf-362a-4cad-85e5-3f450bd7f2f5"). InnerVolumeSpecName "kube-api-access-57gj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:17:12 crc kubenswrapper[4781]: I0227 01:17:12.929872 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57gj7\" (UniqueName: \"kubernetes.io/projected/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-kube-api-access-57gj7\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:13 crc kubenswrapper[4781]: I0227 01:17:13.325547 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba840bdf-362a-4cad-85e5-3f450bd7f2f5" path="/var/lib/kubelet/pods/ba840bdf-362a-4cad-85e5-3f450bd7f2f5/volumes" Feb 27 01:17:13 crc kubenswrapper[4781]: I0227 01:17:13.587527 4781 scope.go:117] "RemoveContainer" containerID="aa2a575779b6b05c095bd940a68ea17e213f83823d4e072ee160e36b9bfd3fea" Feb 27 01:17:13 crc kubenswrapper[4781]: I0227 01:17:13.587579 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:41 crc kubenswrapper[4781]: I0227 01:17:41.998243 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_58009056-4183-4017-bfa1-c14ce28b92ea/init-config-reloader/0.log" Feb 27 01:17:42 crc kubenswrapper[4781]: I0227 01:17:42.221799 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_58009056-4183-4017-bfa1-c14ce28b92ea/init-config-reloader/0.log" Feb 27 01:17:42 crc kubenswrapper[4781]: I0227 01:17:42.227154 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_58009056-4183-4017-bfa1-c14ce28b92ea/config-reloader/0.log" Feb 27 01:17:42 crc kubenswrapper[4781]: I0227 01:17:42.249381 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_58009056-4183-4017-bfa1-c14ce28b92ea/alertmanager/0.log" Feb 27 01:17:42 crc kubenswrapper[4781]: I0227 01:17:42.935350 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-9fcdb6594-94vkn_582fee51-d9df-4150-b217-889f2f4f8852/barbican-api/0.log" Feb 27 01:17:42 crc kubenswrapper[4781]: I0227 01:17:42.949212 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6fff4854c8-ttzsm_41039943-96a7-4fe6-8b66-0d64cd12a1fa/barbican-keystone-listener/0.log" Feb 27 01:17:42 crc kubenswrapper[4781]: I0227 01:17:42.981765 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-9fcdb6594-94vkn_582fee51-d9df-4150-b217-889f2f4f8852/barbican-api-log/0.log" Feb 27 01:17:43 crc kubenswrapper[4781]: I0227 01:17:43.170980 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7dd7c6f4ff-m4d2l_f92df023-2e4a-495e-bbef-4a043c661f46/barbican-worker/0.log" Feb 27 01:17:43 crc kubenswrapper[4781]: I0227 01:17:43.233912 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6fff4854c8-ttzsm_41039943-96a7-4fe6-8b66-0d64cd12a1fa/barbican-keystone-listener-log/0.log" Feb 27 01:17:43 crc kubenswrapper[4781]: I0227 01:17:43.272293 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7dd7c6f4ff-m4d2l_f92df023-2e4a-495e-bbef-4a043c661f46/barbican-worker-log/0.log" Feb 27 01:17:43 crc kubenswrapper[4781]: I0227 01:17:43.503186 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp_94c301c2-f624-44a1-ad01-7d60748c5fca/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:43 crc kubenswrapper[4781]: I0227 01:17:43.626783 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4f5736d7-ab3f-41d9-b5ec-94da30e708f1/ceilometer-central-agent/0.log" Feb 27 01:17:43 crc kubenswrapper[4781]: I0227 01:17:43.810489 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_4f5736d7-ab3f-41d9-b5ec-94da30e708f1/ceilometer-notification-agent/0.log" Feb 27 01:17:43 crc kubenswrapper[4781]: I0227 01:17:43.844294 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4f5736d7-ab3f-41d9-b5ec-94da30e708f1/proxy-httpd/0.log" Feb 27 01:17:44 crc kubenswrapper[4781]: I0227 01:17:44.020605 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4f5736d7-ab3f-41d9-b5ec-94da30e708f1/sg-core/0.log" Feb 27 01:17:44 crc kubenswrapper[4781]: I0227 01:17:44.086887 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1cb0bf7e-097c-4c30-b0e6-224090588da2/cinder-api-log/0.log" Feb 27 01:17:44 crc kubenswrapper[4781]: I0227 01:17:44.147783 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1cb0bf7e-097c-4c30-b0e6-224090588da2/cinder-api/0.log" Feb 27 01:17:44 crc kubenswrapper[4781]: I0227 01:17:44.677248 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_16cb4c6c-2ddb-41e0-8db3-f44961445474/probe/0.log" Feb 27 01:17:44 crc kubenswrapper[4781]: I0227 01:17:44.902440 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_16cb4c6c-2ddb-41e0-8db3-f44961445474/cinder-scheduler/0.log" Feb 27 01:17:45 crc kubenswrapper[4781]: I0227 01:17:45.199954 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_a7ad9523-5281-4d1c-a9d5-92982905d525/cloudkitty-api-log/0.log" Feb 27 01:17:45 crc kubenswrapper[4781]: I0227 01:17:45.220956 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_42503ae1-b143-45c3-8789-e2d1f72cc335/loki-compactor/0.log" Feb 27 01:17:45 crc kubenswrapper[4781]: I0227 01:17:45.352493 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-api-0_a7ad9523-5281-4d1c-a9d5-92982905d525/cloudkitty-api/0.log" Feb 27 01:17:45 crc kubenswrapper[4781]: I0227 01:17:45.545391 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-585d9bcbc-nqbgf_a5170e93-09e9-40d2-ac65-b87d44ceb185/loki-distributor/0.log" Feb 27 01:17:45 crc kubenswrapper[4781]: I0227 01:17:45.646715 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-bxttl_877c39ec-0202-4987-b6e7-4fb90c4dc9b5/gateway/0.log" Feb 27 01:17:45 crc kubenswrapper[4781]: I0227 01:17:45.735069 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-mj87x_233250c8-3871-43ec-8c1d-47bd1d3133e1/gateway/0.log" Feb 27 01:17:46 crc kubenswrapper[4781]: I0227 01:17:46.124950 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_2691e066-2f4c-4e7e-bcac-01933bd6cadb/loki-ingester/0.log" Feb 27 01:17:46 crc kubenswrapper[4781]: I0227 01:17:46.167250 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_684ccdab-ae41-466c-bf47-78c3ada41164/loki-index-gateway/0.log" Feb 27 01:17:46 crc kubenswrapper[4781]: I0227 01:17:46.400269 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f_d9e3acc2-cee4-4bfe-af04-3a64041fc327/loki-query-frontend/0.log" Feb 27 01:17:46 crc kubenswrapper[4781]: I0227 01:17:46.814347 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg_95533111-b2e6-41c2-b7b8-edc0a82e2ca5/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:47 crc kubenswrapper[4781]: I0227 01:17:47.092266 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-9drr8_f01f0f26-7e7a-464f-8f50-4d49bf87cb46/init/0.log" Feb 27 01:17:47 crc kubenswrapper[4781]: I0227 01:17:47.135189 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-qtql5_b05a1d9c-7887-4173-99fe-97f7c89cc555/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:47 crc kubenswrapper[4781]: I0227 01:17:47.303392 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-9drr8_f01f0f26-7e7a-464f-8f50-4d49bf87cb46/init/0.log" Feb 27 01:17:47 crc kubenswrapper[4781]: I0227 01:17:47.498454 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-58c84b5844-whkj4_d71cee9c-2288-4843-ab71-0720c8527073/loki-querier/0.log" Feb 27 01:17:47 crc kubenswrapper[4781]: I0227 01:17:47.530944 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-9drr8_f01f0f26-7e7a-464f-8f50-4d49bf87cb46/dnsmasq-dns/0.log" Feb 27 01:17:47 crc kubenswrapper[4781]: I0227 01:17:47.818377 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs_756e2fbc-556d-44b8-8820-e469ae73ff3b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:47 crc kubenswrapper[4781]: I0227 01:17:47.997260 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_409aba7a-466d-40a0-b9bd-7dfd8d81ee4f/glance-log/0.log" Feb 27 01:17:48 crc kubenswrapper[4781]: I0227 01:17:48.017933 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_409aba7a-466d-40a0-b9bd-7dfd8d81ee4f/glance-httpd/0.log" Feb 27 01:17:48 crc kubenswrapper[4781]: I0227 01:17:48.207023 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_141465f3-d299-4d9c-a74f-0df5c741e325/glance-httpd/0.log" Feb 27 01:17:48 crc kubenswrapper[4781]: I0227 01:17:48.262112 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_141465f3-d299-4d9c-a74f-0df5c741e325/glance-log/0.log" Feb 27 01:17:48 crc kubenswrapper[4781]: I0227 01:17:48.531404 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-fx894_0dace61f-2e30-4132-9ce6-1cb1c8a6cedc/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:48 crc kubenswrapper[4781]: I0227 01:17:48.663573 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-rrxrj_29e8157f-b610-48f3-93ac-9173fa6d484a/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:48 crc kubenswrapper[4781]: I0227 01:17:48.929747 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29535901-2chr7_8f6a0640-2204-47a2-a550-7a7bb14ebc0d/keystone-cron/0.log" Feb 27 01:17:49 crc kubenswrapper[4781]: I0227 01:17:49.198825 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_25933928-b136-4b38-955a-46a3d802a62b/kube-state-metrics/0.log" Feb 27 01:17:49 crc kubenswrapper[4781]: I0227 01:17:49.212731 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-56459cf68c-4q7c8_2467458a-476f-460f-a6ce-144d7304476d/keystone-api/0.log" Feb 27 01:17:49 crc kubenswrapper[4781]: I0227 01:17:49.312249 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_cf4c3569-6860-4c2a-8923-42e436279a11/cloudkitty-proc/0.log" Feb 27 01:17:49 crc kubenswrapper[4781]: I0227 01:17:49.414247 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c_bd292468-b151-4004-b0b7-bd873e7e4e2d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:49 crc kubenswrapper[4781]: I0227 01:17:49.692788 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56f5d76fc7-rbhdd_384db6f0-71f1-4926-9e65-5c27eb430325/neutron-api/0.log" Feb 27 01:17:49 crc kubenswrapper[4781]: I0227 01:17:49.783037 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56f5d76fc7-rbhdd_384db6f0-71f1-4926-9e65-5c27eb430325/neutron-httpd/0.log" Feb 27 01:17:49 crc kubenswrapper[4781]: I0227 01:17:49.977602 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq_3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:50 crc kubenswrapper[4781]: I0227 01:17:50.424495 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4e258c11-5caa-4d6b-ab77-841ddf83ac81/nova-api-log/0.log" Feb 27 01:17:50 crc kubenswrapper[4781]: I0227 01:17:50.465297 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_7503d0a7-eca6-4d15-9538-9cded970acc2/nova-cell0-conductor-conductor/0.log" Feb 27 01:17:50 crc kubenswrapper[4781]: I0227 01:17:50.766792 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4e258c11-5caa-4d6b-ab77-841ddf83ac81/nova-api-api/0.log" Feb 27 01:17:51 crc kubenswrapper[4781]: I0227 01:17:51.229202 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c8c40a18-7bbd-4d06-8a8a-427de95016fa/nova-cell1-conductor-conductor/0.log" Feb 27 01:17:51 crc kubenswrapper[4781]: I0227 01:17:51.258941 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a3b399a8-7654-47f3-be04-759080f4f180/nova-cell1-novncproxy-novncproxy/0.log" Feb 27 01:17:51 crc kubenswrapper[4781]: I0227 01:17:51.403947 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-ntt4h_d3f8abc3-17b4-4d88-890e-85304a100a97/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:51 crc kubenswrapper[4781]: I0227 01:17:51.585111 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e32c3573-4acb-4d70-aa6e-2d647c108931/nova-metadata-log/0.log" Feb 27 01:17:51 crc kubenswrapper[4781]: I0227 01:17:51.841026 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_1d7f8c00-d318-4f7d-b67e-6743c3a82dae/nova-scheduler-scheduler/0.log" Feb 27 01:17:51 crc kubenswrapper[4781]: I0227 01:17:51.911112 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_22624edd-e366-4aff-84dd-c3cec89c0591/mysql-bootstrap/0.log" Feb 27 01:17:52 crc kubenswrapper[4781]: I0227 01:17:52.167940 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_22624edd-e366-4aff-84dd-c3cec89c0591/mysql-bootstrap/0.log" Feb 27 01:17:52 crc kubenswrapper[4781]: I0227 01:17:52.225613 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_22624edd-e366-4aff-84dd-c3cec89c0591/galera/0.log" Feb 27 01:17:52 crc kubenswrapper[4781]: I0227 01:17:52.962953 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d59d3864-af0d-407c-8431-ae2e17e4b46f/mysql-bootstrap/0.log" Feb 27 01:17:53 crc kubenswrapper[4781]: I0227 01:17:53.026550 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e32c3573-4acb-4d70-aa6e-2d647c108931/nova-metadata-metadata/0.log" Feb 27 01:17:53 crc kubenswrapper[4781]: I0227 01:17:53.265200 4781 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d59d3864-af0d-407c-8431-ae2e17e4b46f/galera/0.log" Feb 27 01:17:53 crc kubenswrapper[4781]: I0227 01:17:53.281263 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d59d3864-af0d-407c-8431-ae2e17e4b46f/mysql-bootstrap/0.log" Feb 27 01:17:53 crc kubenswrapper[4781]: I0227 01:17:53.395932 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_02c4875e-e180-4365-a00a-828ab5d95c34/openstackclient/0.log" Feb 27 01:17:53 crc kubenswrapper[4781]: I0227 01:17:53.561772 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9zkpb_092921e0-a033-4021-b0f5-9c89de3aa830/ovn-controller/0.log" Feb 27 01:17:53 crc kubenswrapper[4781]: I0227 01:17:53.695040 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hx85z_cf463d95-25dd-4b99-afb0-dac99157c5fa/openstack-network-exporter/0.log" Feb 27 01:17:53 crc kubenswrapper[4781]: I0227 01:17:53.879682 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hcb9s_9c2c498e-52b1-4ee2-bcf8-3599ee89513c/ovsdb-server-init/0.log" Feb 27 01:17:54 crc kubenswrapper[4781]: I0227 01:17:54.096008 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hcb9s_9c2c498e-52b1-4ee2-bcf8-3599ee89513c/ovsdb-server-init/0.log" Feb 27 01:17:54 crc kubenswrapper[4781]: I0227 01:17:54.162189 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hcb9s_9c2c498e-52b1-4ee2-bcf8-3599ee89513c/ovs-vswitchd/0.log" Feb 27 01:17:54 crc kubenswrapper[4781]: I0227 01:17:54.165957 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hcb9s_9c2c498e-52b1-4ee2-bcf8-3599ee89513c/ovsdb-server/0.log" Feb 27 01:17:54 crc kubenswrapper[4781]: I0227 01:17:54.477900 4781 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-27lcw_e61bcd0e-2490-4f8e-a429-cf07405dc01b/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:54 crc kubenswrapper[4781]: I0227 01:17:54.488375 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d5923572-3637-49e3-9eea-72e52c5fb88b/openstack-network-exporter/0.log" Feb 27 01:17:54 crc kubenswrapper[4781]: I0227 01:17:54.557165 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d5923572-3637-49e3-9eea-72e52c5fb88b/ovn-northd/0.log" Feb 27 01:17:55 crc kubenswrapper[4781]: I0227 01:17:55.078522 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bd103c67-d035-4de1-aba9-667d1eb67813/openstack-network-exporter/0.log" Feb 27 01:17:55 crc kubenswrapper[4781]: I0227 01:17:55.127538 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bd103c67-d035-4de1-aba9-667d1eb67813/ovsdbserver-nb/0.log" Feb 27 01:17:55 crc kubenswrapper[4781]: I0227 01:17:55.278165 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7d499c77-ccba-41d1-9efb-8424fc7e8d0e/openstack-network-exporter/0.log" Feb 27 01:17:55 crc kubenswrapper[4781]: I0227 01:17:55.369375 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7d499c77-ccba-41d1-9efb-8424fc7e8d0e/ovsdbserver-sb/0.log" Feb 27 01:17:55 crc kubenswrapper[4781]: I0227 01:17:55.496381 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d64c6bb46-jcp5p_5ff35aa7-7e5a-4069-8dc4-392e01a957e3/placement-api/0.log" Feb 27 01:17:55 crc kubenswrapper[4781]: I0227 01:17:55.637162 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d64c6bb46-jcp5p_5ff35aa7-7e5a-4069-8dc4-392e01a957e3/placement-log/0.log" Feb 27 01:17:55 crc kubenswrapper[4781]: I0227 01:17:55.664976 4781 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f/init-config-reloader/0.log" Feb 27 01:17:56 crc kubenswrapper[4781]: I0227 01:17:56.058391 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f/config-reloader/0.log" Feb 27 01:17:56 crc kubenswrapper[4781]: I0227 01:17:56.097085 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f/init-config-reloader/0.log" Feb 27 01:17:56 crc kubenswrapper[4781]: I0227 01:17:56.102572 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f/prometheus/0.log" Feb 27 01:17:56 crc kubenswrapper[4781]: I0227 01:17:56.103880 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f/thanos-sidecar/0.log" Feb 27 01:17:56 crc kubenswrapper[4781]: I0227 01:17:56.333656 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_37519387-1738-4500-9953-52deba3e4a85/setup-container/0.log" Feb 27 01:17:56 crc kubenswrapper[4781]: I0227 01:17:56.575180 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_37519387-1738-4500-9953-52deba3e4a85/rabbitmq/0.log" Feb 27 01:17:56 crc kubenswrapper[4781]: I0227 01:17:56.649860 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ed38e2f2-b350-4abd-abe2-859c9d504aa8/setup-container/0.log" Feb 27 01:17:56 crc kubenswrapper[4781]: I0227 01:17:56.663550 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_37519387-1738-4500-9953-52deba3e4a85/setup-container/0.log" Feb 27 01:17:57 crc kubenswrapper[4781]: I0227 01:17:57.052451 4781 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ed38e2f2-b350-4abd-abe2-859c9d504aa8/rabbitmq/0.log" Feb 27 01:17:57 crc kubenswrapper[4781]: I0227 01:17:57.111311 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz_98c901e2-eff5-4256-9add-25d09beb51e3/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:57 crc kubenswrapper[4781]: I0227 01:17:57.131425 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ed38e2f2-b350-4abd-abe2-859c9d504aa8/setup-container/0.log" Feb 27 01:17:57 crc kubenswrapper[4781]: I0227 01:17:57.353003 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-4tds4_ca27d369-00b1-47ec-88cc-87d4a7065356/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:57 crc kubenswrapper[4781]: I0227 01:17:57.355578 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt_05795337-1929-47d6-b63f-96d078b66c47/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:57 crc kubenswrapper[4781]: I0227 01:17:57.782177 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-n6nts_2a7f1888-0c26-47e0-91b4-fbf07824cab4/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:57 crc kubenswrapper[4781]: I0227 01:17:57.872272 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-vvrmt_35b9cf19-a1cd-48b5-9072-d5c71680c892/ssh-known-hosts-edpm-deployment/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.308058 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5c945d84cf-z5v9s_e8ba5117-540f-448d-aac6-6fde482f5f14/proxy-server/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.428891 
4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-6n9rn_b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b/swift-ring-rebalance/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.452033 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5c945d84cf-z5v9s_e8ba5117-540f-448d-aac6-6fde482f5f14/proxy-httpd/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.571872 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/account-auditor/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.628254 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/account-reaper/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.700432 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/account-replicator/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.761392 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/account-server/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.806207 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/container-auditor/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.907042 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/container-replicator/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.981300 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/container-server/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.992343 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/container-updater/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.157009 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/object-auditor/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.182594 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/object-expirer/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.286123 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/object-replicator/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.490869 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/object-server/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.504330 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/rsync/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.519718 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/object-updater/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.627450 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/swift-recon-cron/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.854782 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs_7a6c3903-7dfd-49cd-a92f-d138e10db404/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.904835 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_2cc23bf5-7773-4d33-b2be-2ee2a807f086/tempest-tests-tempest-tests-runner/0.log" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.018549 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_083b0010-19f4-4944-a097-96d20dad7eda/test-operator-logs-container/0.log" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.111939 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j_9f7ced88-662a-42f0-8385-97292a7f4ce4/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.165703 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535918-hlgxs"] Feb 27 01:18:00 crc kubenswrapper[4781]: E0227 01:18:00.166213 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba840bdf-362a-4cad-85e5-3f450bd7f2f5" containerName="container-00" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.166239 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba840bdf-362a-4cad-85e5-3f450bd7f2f5" containerName="container-00" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.166487 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba840bdf-362a-4cad-85e5-3f450bd7f2f5" containerName="container-00" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.167454 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.173247 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535918-hlgxs"] Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.175678 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.175892 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.176158 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.287488 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz84t\" (UniqueName: \"kubernetes.io/projected/93462151-bfc8-4c6a-8d83-adc55e0b038c-kube-api-access-hz84t\") pod \"auto-csr-approver-29535918-hlgxs\" (UID: \"93462151-bfc8-4c6a-8d83-adc55e0b038c\") " pod="openshift-infra/auto-csr-approver-29535918-hlgxs" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.389517 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz84t\" (UniqueName: \"kubernetes.io/projected/93462151-bfc8-4c6a-8d83-adc55e0b038c-kube-api-access-hz84t\") pod \"auto-csr-approver-29535918-hlgxs\" (UID: \"93462151-bfc8-4c6a-8d83-adc55e0b038c\") " pod="openshift-infra/auto-csr-approver-29535918-hlgxs" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.414588 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz84t\" (UniqueName: \"kubernetes.io/projected/93462151-bfc8-4c6a-8d83-adc55e0b038c-kube-api-access-hz84t\") pod \"auto-csr-approver-29535918-hlgxs\" (UID: \"93462151-bfc8-4c6a-8d83-adc55e0b038c\") " 
pod="openshift-infra/auto-csr-approver-29535918-hlgxs" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.507131 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" Feb 27 01:18:01 crc kubenswrapper[4781]: I0227 01:18:01.044175 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535918-hlgxs"] Feb 27 01:18:01 crc kubenswrapper[4781]: I0227 01:18:01.057290 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:18:02 crc kubenswrapper[4781]: I0227 01:18:02.075702 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" event={"ID":"93462151-bfc8-4c6a-8d83-adc55e0b038c","Type":"ContainerStarted","Data":"79ffb766c3e5304283b457d328e97935eaab5825ccd66c5d31295439b77ab474"} Feb 27 01:18:03 crc kubenswrapper[4781]: I0227 01:18:03.086648 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" event={"ID":"93462151-bfc8-4c6a-8d83-adc55e0b038c","Type":"ContainerStarted","Data":"500185c8a41f1ea03fad4eed8ceeb62b2a655600fefd254d6835b485744f3e8b"} Feb 27 01:18:03 crc kubenswrapper[4781]: I0227 01:18:03.107520 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" podStartSLOduration=1.898426119 podStartE2EDuration="3.10749825s" podCreationTimestamp="2026-02-27 01:18:00 +0000 UTC" firstStartedPulling="2026-02-27 01:18:01.056947753 +0000 UTC m=+4350.314487307" lastFinishedPulling="2026-02-27 01:18:02.266019884 +0000 UTC m=+4351.523559438" observedRunningTime="2026-02-27 01:18:03.100339858 +0000 UTC m=+4352.357879422" watchObservedRunningTime="2026-02-27 01:18:03.10749825 +0000 UTC m=+4352.365037814" Feb 27 01:18:04 crc kubenswrapper[4781]: I0227 01:18:04.115995 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="93462151-bfc8-4c6a-8d83-adc55e0b038c" containerID="500185c8a41f1ea03fad4eed8ceeb62b2a655600fefd254d6835b485744f3e8b" exitCode=0 Feb 27 01:18:04 crc kubenswrapper[4781]: I0227 01:18:04.116356 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" event={"ID":"93462151-bfc8-4c6a-8d83-adc55e0b038c","Type":"ContainerDied","Data":"500185c8a41f1ea03fad4eed8ceeb62b2a655600fefd254d6835b485744f3e8b"} Feb 27 01:18:04 crc kubenswrapper[4781]: I0227 01:18:04.853119 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_06e98c4a-d812-4e42-b95c-d263e49bf5d3/memcached/0.log" Feb 27 01:18:05 crc kubenswrapper[4781]: I0227 01:18:05.714666 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" Feb 27 01:18:05 crc kubenswrapper[4781]: I0227 01:18:05.831470 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz84t\" (UniqueName: \"kubernetes.io/projected/93462151-bfc8-4c6a-8d83-adc55e0b038c-kube-api-access-hz84t\") pod \"93462151-bfc8-4c6a-8d83-adc55e0b038c\" (UID: \"93462151-bfc8-4c6a-8d83-adc55e0b038c\") " Feb 27 01:18:05 crc kubenswrapper[4781]: I0227 01:18:05.838693 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93462151-bfc8-4c6a-8d83-adc55e0b038c-kube-api-access-hz84t" (OuterVolumeSpecName: "kube-api-access-hz84t") pod "93462151-bfc8-4c6a-8d83-adc55e0b038c" (UID: "93462151-bfc8-4c6a-8d83-adc55e0b038c"). InnerVolumeSpecName "kube-api-access-hz84t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:18:05 crc kubenswrapper[4781]: I0227 01:18:05.933872 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz84t\" (UniqueName: \"kubernetes.io/projected/93462151-bfc8-4c6a-8d83-adc55e0b038c-kube-api-access-hz84t\") on node \"crc\" DevicePath \"\"" Feb 27 01:18:06 crc kubenswrapper[4781]: I0227 01:18:06.137192 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" event={"ID":"93462151-bfc8-4c6a-8d83-adc55e0b038c","Type":"ContainerDied","Data":"79ffb766c3e5304283b457d328e97935eaab5825ccd66c5d31295439b77ab474"} Feb 27 01:18:06 crc kubenswrapper[4781]: I0227 01:18:06.137227 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79ffb766c3e5304283b457d328e97935eaab5825ccd66c5d31295439b77ab474" Feb 27 01:18:06 crc kubenswrapper[4781]: I0227 01:18:06.137612 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" Feb 27 01:18:06 crc kubenswrapper[4781]: I0227 01:18:06.187152 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535912-gtvv9"] Feb 27 01:18:06 crc kubenswrapper[4781]: I0227 01:18:06.197321 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535912-gtvv9"] Feb 27 01:18:07 crc kubenswrapper[4781]: I0227 01:18:07.322086 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d6c94d-6b4e-4d64-8c67-eb43c03187c2" path="/var/lib/kubelet/pods/95d6c94d-6b4e-4d64-8c67-eb43c03187c2/volumes" Feb 27 01:18:12 crc kubenswrapper[4781]: I0227 01:18:12.895650 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 27 01:18:12 crc kubenswrapper[4781]: I0227 01:18:12.896271 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:18:30 crc kubenswrapper[4781]: I0227 01:18:30.390264 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b_343b5811-baf3-443e-a8fe-074f7b844d14/util/0.log" Feb 27 01:18:30 crc kubenswrapper[4781]: I0227 01:18:30.708378 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b_343b5811-baf3-443e-a8fe-074f7b844d14/util/0.log" Feb 27 01:18:30 crc kubenswrapper[4781]: I0227 01:18:30.728358 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b_343b5811-baf3-443e-a8fe-074f7b844d14/pull/0.log" Feb 27 01:18:30 crc kubenswrapper[4781]: I0227 01:18:30.769244 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b_343b5811-baf3-443e-a8fe-074f7b844d14/pull/0.log" Feb 27 01:18:30 crc kubenswrapper[4781]: I0227 01:18:30.917957 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b_343b5811-baf3-443e-a8fe-074f7b844d14/extract/0.log" Feb 27 01:18:30 crc kubenswrapper[4781]: I0227 01:18:30.944859 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b_343b5811-baf3-443e-a8fe-074f7b844d14/util/0.log" Feb 27 
01:18:30 crc kubenswrapper[4781]: I0227 01:18:30.961363 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b_343b5811-baf3-443e-a8fe-074f7b844d14/pull/0.log" Feb 27 01:18:31 crc kubenswrapper[4781]: I0227 01:18:31.355659 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-rn44b_bd77d7fe-85fb-4b16-aa12-75359b52e139/manager/0.log" Feb 27 01:18:31 crc kubenswrapper[4781]: I0227 01:18:31.820670 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-4gl88_66c995b3-f763-455e-8ea3-7dfdfb4c4301/manager/0.log" Feb 27 01:18:32 crc kubenswrapper[4781]: I0227 01:18:32.015296 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-nfzvw_6739bbb3-bf62-4b1d-8dd7-3accde691e66/manager/0.log" Feb 27 01:18:32 crc kubenswrapper[4781]: I0227 01:18:32.327007 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-fmbwz_c1807c06-6c68-477c-8725-5702e2d59c93/manager/0.log" Feb 27 01:18:32 crc kubenswrapper[4781]: I0227 01:18:32.917841 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-szs2w_513da4ed-be63-45dd-a32a-27ac3ef443a5/manager/0.log" Feb 27 01:18:33 crc kubenswrapper[4781]: I0227 01:18:33.033156 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-vhmbb_771a50fd-33f6-47ba-ac4a-46da5446cdd8/manager/0.log" Feb 27 01:18:33 crc kubenswrapper[4781]: I0227 01:18:33.444425 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-2pgf6_057d4c8d-606e-44ea-89ea-fb17b4d63733/manager/0.log" Feb 27 01:18:33 crc kubenswrapper[4781]: I0227 01:18:33.600922 4781 scope.go:117] "RemoveContainer" containerID="8f787ca4f347bb157c6f5d9ee468bbb739868634c8f4daa10b685f41a5344282" Feb 27 01:18:33 crc kubenswrapper[4781]: I0227 01:18:33.603043 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-jnhdb_a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad/manager/0.log" Feb 27 01:18:33 crc kubenswrapper[4781]: I0227 01:18:33.716393 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-fb2wf_e4d59c4e-1fd2-43d9-8ac2-d162e746e758/manager/0.log" Feb 27 01:18:33 crc kubenswrapper[4781]: I0227 01:18:33.901550 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-w5wp5_f777df4b-1040-4f86-a816-ea778b9e5dc3/manager/0.log" Feb 27 01:18:34 crc kubenswrapper[4781]: I0227 01:18:34.166449 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-v5hwb_fe25346c-5f31-478e-a639-060c5958b1eb/manager/0.log" Feb 27 01:18:35 crc kubenswrapper[4781]: I0227 01:18:35.082007 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-trb7t_e9a3b900-688c-4043-b1ff-53ae1c3ee1d6/manager/0.log" Feb 27 01:18:35 crc kubenswrapper[4781]: I0227 01:18:35.208146 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-tb298_7d5e1e13-5ce4-48ba-a8c9-3db924e63840/manager/0.log" Feb 27 01:18:35 crc kubenswrapper[4781]: I0227 01:18:35.379468 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4_83466be2-d230-4516-b594-ee56aae3c510/manager/0.log" Feb 27 01:18:35 crc kubenswrapper[4781]: I0227 01:18:35.721828 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-rrx6z_f66c974d-5687-42bd-9742-469922240fd5/registry-server/0.log" Feb 27 01:18:35 crc kubenswrapper[4781]: I0227 01:18:35.733369 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-85cf9d4d7d-cl7rb_837579c4-87be-4ce8-94ff-bf25307562db/operator/0.log" Feb 27 01:18:36 crc kubenswrapper[4781]: I0227 01:18:36.079205 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-bvdd5_fae0f5f8-e721-4ef1-9c8f-4574f156913f/manager/0.log" Feb 27 01:18:36 crc kubenswrapper[4781]: I0227 01:18:36.198587 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-rn2vt_9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1/manager/0.log" Feb 27 01:18:36 crc kubenswrapper[4781]: I0227 01:18:36.450051 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-7gr7g_6d15395c-5ed9-43c8-b7f6-ac16e6e32e70/operator/0.log" Feb 27 01:18:37 crc kubenswrapper[4781]: I0227 01:18:37.048300 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-5mgl8_3747ddf8-799c-441c-bd9d-4450bdb72382/manager/0.log" Feb 27 01:18:37 crc kubenswrapper[4781]: I0227 01:18:37.234555 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-dc7k2_cf1fe81a-282d-4e51-b8d9-d6569a640985/manager/0.log" Feb 27 01:18:37 crc kubenswrapper[4781]: I0227 01:18:37.533104 4781 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-gs62l_d31610db-32c1-4c99-9001-ab4504649a75/manager/0.log" Feb 27 01:18:37 crc kubenswrapper[4781]: I0227 01:18:37.600134 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7677fd857d-kxknf_9fe881c2-cb59-41ce-a23c-f2dcba86d9c3/manager/0.log" Feb 27 01:18:37 crc kubenswrapper[4781]: I0227 01:18:37.699506 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-9d678b567-gttml_11361a5e-18c5-448a-8b07-8f5e3245f607/manager/0.log" Feb 27 01:18:42 crc kubenswrapper[4781]: I0227 01:18:42.617033 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-rfwpm_fe1f6a92-751f-417e-b2ff-694c10210db7/manager/0.log" Feb 27 01:18:42 crc kubenswrapper[4781]: I0227 01:18:42.895268 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:18:42 crc kubenswrapper[4781]: I0227 01:18:42.895591 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:19:00 crc kubenswrapper[4781]: I0227 01:19:00.898898 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8d9mv_010c6a41-8e2d-4391-ac1b-82814dad98a4/control-plane-machine-set-operator/0.log" Feb 27 01:19:01 crc kubenswrapper[4781]: I0227 01:19:01.107957 
4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-29z97_77c54f3f-bdb8-42ff-a466-3bfb1e2d9464/kube-rbac-proxy/0.log" Feb 27 01:19:01 crc kubenswrapper[4781]: I0227 01:19:01.142454 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-29z97_77c54f3f-bdb8-42ff-a466-3bfb1e2d9464/machine-api-operator/0.log" Feb 27 01:19:12 crc kubenswrapper[4781]: I0227 01:19:12.895401 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:19:12 crc kubenswrapper[4781]: I0227 01:19:12.897384 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:19:12 crc kubenswrapper[4781]: I0227 01:19:12.897538 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 01:19:12 crc kubenswrapper[4781]: I0227 01:19:12.898515 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"93857194fe96d9ea4ad88dce6987b56ca3a1bbc406106d6f82950d6a036e6c83"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 01:19:12 crc kubenswrapper[4781]: I0227 01:19:12.898672 4781 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://93857194fe96d9ea4ad88dce6987b56ca3a1bbc406106d6f82950d6a036e6c83" gracePeriod=600 Feb 27 01:19:13 crc kubenswrapper[4781]: I0227 01:19:13.775331 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="93857194fe96d9ea4ad88dce6987b56ca3a1bbc406106d6f82950d6a036e6c83" exitCode=0 Feb 27 01:19:13 crc kubenswrapper[4781]: I0227 01:19:13.775968 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"93857194fe96d9ea4ad88dce6987b56ca3a1bbc406106d6f82950d6a036e6c83"} Feb 27 01:19:13 crc kubenswrapper[4781]: I0227 01:19:13.776003 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f"} Feb 27 01:19:13 crc kubenswrapper[4781]: I0227 01:19:13.776025 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:19:15 crc kubenswrapper[4781]: I0227 01:19:15.224260 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-mwpvm_749ed3fc-65b7-4674-a1b1-0433692d2d89/cert-manager-controller/0.log" Feb 27 01:19:15 crc kubenswrapper[4781]: I0227 01:19:15.460727 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-rwwkv_b732ab89-7ea1-4378-9511-229ee7fa787f/cert-manager-webhook/0.log" Feb 27 01:19:15 crc kubenswrapper[4781]: I0227 01:19:15.486687 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-95z7d_af9e6ffa-5ea0-473d-9e75-a2715093490f/cert-manager-cainjector/0.log" Feb 27 01:19:29 crc kubenswrapper[4781]: I0227 01:19:29.012785 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-5dzvp_fcd8e350-64e3-4a25-9bc5-cce4888da20a/nmstate-console-plugin/0.log" Feb 27 01:19:29 crc kubenswrapper[4781]: I0227 01:19:29.198748 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-r6fjq_f7bf5593-bd4f-462d-bcbf-319b075a5116/nmstate-handler/0.log" Feb 27 01:19:29 crc kubenswrapper[4781]: I0227 01:19:29.336722 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-4d4ds_2b001223-04cf-4a45-843b-e62c5d13ac14/nmstate-metrics/0.log" Feb 27 01:19:29 crc kubenswrapper[4781]: I0227 01:19:29.367917 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-4d4ds_2b001223-04cf-4a45-843b-e62c5d13ac14/kube-rbac-proxy/0.log" Feb 27 01:19:29 crc kubenswrapper[4781]: I0227 01:19:29.479519 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-m8kqs_e948619f-a0f4-4463-9076-e593529e4264/nmstate-operator/0.log" Feb 27 01:19:29 crc kubenswrapper[4781]: I0227 01:19:29.591324 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-vrv7p_677ca1f7-513f-4de1-b64b-66b2524b82a1/nmstate-webhook/0.log" Feb 27 01:19:44 crc kubenswrapper[4781]: I0227 01:19:44.956149 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6c4cf64b95-qzbxj_1fed4c33-9f3f-486b-8f74-f2d9a09b92be/kube-rbac-proxy/0.log" Feb 27 01:19:45 crc kubenswrapper[4781]: I0227 01:19:45.234196 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6c4cf64b95-qzbxj_1fed4c33-9f3f-486b-8f74-f2d9a09b92be/manager/0.log" Feb 27 01:19:59 crc kubenswrapper[4781]: I0227 01:19:59.958160 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-rbdmr_c62f5f48-b15f-4d70-837c-a05addc48839/prometheus-operator/0.log" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.154834 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535920-hp92r"] Feb 27 01:20:00 crc kubenswrapper[4781]: E0227 01:20:00.155307 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93462151-bfc8-4c6a-8d83-adc55e0b038c" containerName="oc" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.155326 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="93462151-bfc8-4c6a-8d83-adc55e0b038c" containerName="oc" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.155583 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="93462151-bfc8-4c6a-8d83-adc55e0b038c" containerName="oc" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.156497 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535920-hp92r" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.165984 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535920-hp92r"] Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.166505 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.167171 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.167386 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.232595 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db/prometheus-operator-admission-webhook/0.log" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.314277 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgfr7\" (UniqueName: \"kubernetes.io/projected/960f5179-f532-4fbf-90fa-e19414cbe684-kube-api-access-hgfr7\") pod \"auto-csr-approver-29535920-hp92r\" (UID: \"960f5179-f532-4fbf-90fa-e19414cbe684\") " pod="openshift-infra/auto-csr-approver-29535920-hp92r" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.317005 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_cbb658fa-808d-4c87-b81e-63863f31382f/prometheus-operator-admission-webhook/0.log" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.417792 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgfr7\" (UniqueName: 
\"kubernetes.io/projected/960f5179-f532-4fbf-90fa-e19414cbe684-kube-api-access-hgfr7\") pod \"auto-csr-approver-29535920-hp92r\" (UID: \"960f5179-f532-4fbf-90fa-e19414cbe684\") " pod="openshift-infra/auto-csr-approver-29535920-hp92r" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.454597 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgfr7\" (UniqueName: \"kubernetes.io/projected/960f5179-f532-4fbf-90fa-e19414cbe684-kube-api-access-hgfr7\") pod \"auto-csr-approver-29535920-hp92r\" (UID: \"960f5179-f532-4fbf-90fa-e19414cbe684\") " pod="openshift-infra/auto-csr-approver-29535920-hp92r" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.477806 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535920-hp92r" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.485246 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-m6jxs_3fe8e5f0-6c7b-42bd-9604-85a90477d143/operator/0.log" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.506564 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-l5ppf_1a3a6a15-797e-4cfe-8e21-3a813460012d/perses-operator/0.log" Feb 27 01:20:01 crc kubenswrapper[4781]: I0227 01:20:01.125927 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535920-hp92r"] Feb 27 01:20:01 crc kubenswrapper[4781]: I0227 01:20:01.239788 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535920-hp92r" event={"ID":"960f5179-f532-4fbf-90fa-e19414cbe684","Type":"ContainerStarted","Data":"3af6dbf2db555b4c7a2d07f889de88b28f2d2148bb4b56e58f4193e38f778133"} Feb 27 01:20:03 crc kubenswrapper[4781]: I0227 01:20:03.268882 4781 generic.go:334] "Generic (PLEG): container finished" podID="960f5179-f532-4fbf-90fa-e19414cbe684" 
containerID="7e0241ade9afef50720d7328e1e27817a80d46d9126df5c01d0b6695a2c96b4c" exitCode=0 Feb 27 01:20:03 crc kubenswrapper[4781]: I0227 01:20:03.268941 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535920-hp92r" event={"ID":"960f5179-f532-4fbf-90fa-e19414cbe684","Type":"ContainerDied","Data":"7e0241ade9afef50720d7328e1e27817a80d46d9126df5c01d0b6695a2c96b4c"} Feb 27 01:20:04 crc kubenswrapper[4781]: I0227 01:20:04.895326 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535920-hp92r" Feb 27 01:20:05 crc kubenswrapper[4781]: I0227 01:20:05.025276 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgfr7\" (UniqueName: \"kubernetes.io/projected/960f5179-f532-4fbf-90fa-e19414cbe684-kube-api-access-hgfr7\") pod \"960f5179-f532-4fbf-90fa-e19414cbe684\" (UID: \"960f5179-f532-4fbf-90fa-e19414cbe684\") " Feb 27 01:20:05 crc kubenswrapper[4781]: I0227 01:20:05.034961 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/960f5179-f532-4fbf-90fa-e19414cbe684-kube-api-access-hgfr7" (OuterVolumeSpecName: "kube-api-access-hgfr7") pod "960f5179-f532-4fbf-90fa-e19414cbe684" (UID: "960f5179-f532-4fbf-90fa-e19414cbe684"). InnerVolumeSpecName "kube-api-access-hgfr7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:20:05 crc kubenswrapper[4781]: I0227 01:20:05.129104 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgfr7\" (UniqueName: \"kubernetes.io/projected/960f5179-f532-4fbf-90fa-e19414cbe684-kube-api-access-hgfr7\") on node \"crc\" DevicePath \"\"" Feb 27 01:20:05 crc kubenswrapper[4781]: I0227 01:20:05.290830 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535920-hp92r" event={"ID":"960f5179-f532-4fbf-90fa-e19414cbe684","Type":"ContainerDied","Data":"3af6dbf2db555b4c7a2d07f889de88b28f2d2148bb4b56e58f4193e38f778133"} Feb 27 01:20:05 crc kubenswrapper[4781]: I0227 01:20:05.290869 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3af6dbf2db555b4c7a2d07f889de88b28f2d2148bb4b56e58f4193e38f778133" Feb 27 01:20:05 crc kubenswrapper[4781]: I0227 01:20:05.290877 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535920-hp92r" Feb 27 01:20:06 crc kubenswrapper[4781]: I0227 01:20:06.013669 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535914-n4hsg"] Feb 27 01:20:06 crc kubenswrapper[4781]: I0227 01:20:06.022581 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535914-n4hsg"] Feb 27 01:20:07 crc kubenswrapper[4781]: I0227 01:20:07.322614 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2afc6c2c-4602-4819-bb62-46008ced90dc" path="/var/lib/kubelet/pods/2afc6c2c-4602-4819-bb62-46008ced90dc/volumes" Feb 27 01:20:17 crc kubenswrapper[4781]: I0227 01:20:17.715308 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-c6m2v_dc6f679c-913d-4851-b69d-a2e26ebf450a/kube-rbac-proxy/0.log" Feb 27 01:20:17 crc kubenswrapper[4781]: I0227 01:20:17.798305 4781 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-c6m2v_dc6f679c-913d-4851-b69d-a2e26ebf450a/controller/0.log" Feb 27 01:20:17 crc kubenswrapper[4781]: I0227 01:20:17.915597 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-frr-files/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.106524 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-frr-files/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.124065 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-reloader/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.125981 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-metrics/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.188361 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-reloader/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.430018 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-metrics/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.444555 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-frr-files/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.489404 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-reloader/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.511788 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-metrics/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.670608 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-reloader/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.729743 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-metrics/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.770211 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-frr-files/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.774942 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/controller/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.953024 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/frr-metrics/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.964451 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/kube-rbac-proxy/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.990203 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/kube-rbac-proxy-frr/0.log" Feb 27 01:20:19 crc kubenswrapper[4781]: I0227 01:20:19.301858 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/reloader/0.log" Feb 27 01:20:19 crc kubenswrapper[4781]: I0227 01:20:19.463421 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-cqkgx_31409f77-5542-4376-8d77-c7a018b245b7/frr-k8s-webhook-server/0.log" Feb 27 01:20:19 crc kubenswrapper[4781]: I0227 01:20:19.660297 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7586d66d7b-59ntk_7020f39f-9738-4625-bd18-e5e4e64f5956/manager/0.log" Feb 27 01:20:19 crc kubenswrapper[4781]: I0227 01:20:19.906694 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-58cbff967-5sp8v_fc2d6f99-bd3f-44e8-91fc-6865285089e7/webhook-server/0.log" Feb 27 01:20:20 crc kubenswrapper[4781]: I0227 01:20:20.155809 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tljmv_5d7e20ea-c069-4c29-9c3f-1ac3404f026c/kube-rbac-proxy/0.log" Feb 27 01:20:20 crc kubenswrapper[4781]: I0227 01:20:20.664728 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tljmv_5d7e20ea-c069-4c29-9c3f-1ac3404f026c/speaker/0.log" Feb 27 01:20:20 crc kubenswrapper[4781]: I0227 01:20:20.739212 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/frr/0.log" Feb 27 01:20:33 crc kubenswrapper[4781]: I0227 01:20:33.727904 4781 scope.go:117] "RemoveContainer" containerID="b2b6fac5723bb6bb5cfc84762685d87a6769151aad24d4f3926a5af565d7efe8" Feb 27 01:20:36 crc kubenswrapper[4781]: I0227 01:20:36.089447 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676_f26f6c49-1028-49bf-9259-4c08b835cfbb/util/0.log" Feb 27 01:20:36 crc kubenswrapper[4781]: I0227 01:20:36.417036 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676_f26f6c49-1028-49bf-9259-4c08b835cfbb/pull/0.log" Feb 27 01:20:36 crc 
kubenswrapper[4781]: I0227 01:20:36.427088 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676_f26f6c49-1028-49bf-9259-4c08b835cfbb/pull/0.log" Feb 27 01:20:36 crc kubenswrapper[4781]: I0227 01:20:36.432393 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676_f26f6c49-1028-49bf-9259-4c08b835cfbb/util/0.log" Feb 27 01:20:36 crc kubenswrapper[4781]: I0227 01:20:36.601778 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676_f26f6c49-1028-49bf-9259-4c08b835cfbb/extract/0.log" Feb 27 01:20:36 crc kubenswrapper[4781]: I0227 01:20:36.640570 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676_f26f6c49-1028-49bf-9259-4c08b835cfbb/util/0.log" Feb 27 01:20:36 crc kubenswrapper[4781]: I0227 01:20:36.675082 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676_f26f6c49-1028-49bf-9259-4c08b835cfbb/pull/0.log" Feb 27 01:20:36 crc kubenswrapper[4781]: I0227 01:20:36.816699 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft_b41e2a48-4103-4cf3-be92-92180cbb2510/util/0.log" Feb 27 01:20:37 crc kubenswrapper[4781]: I0227 01:20:37.457982 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft_b41e2a48-4103-4cf3-be92-92180cbb2510/util/0.log" Feb 27 01:20:37 crc kubenswrapper[4781]: I0227 01:20:37.461865 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft_b41e2a48-4103-4cf3-be92-92180cbb2510/pull/0.log" Feb 27 01:20:37 crc kubenswrapper[4781]: I0227 01:20:37.524350 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft_b41e2a48-4103-4cf3-be92-92180cbb2510/pull/0.log" Feb 27 01:20:37 crc kubenswrapper[4781]: I0227 01:20:37.707118 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft_b41e2a48-4103-4cf3-be92-92180cbb2510/pull/0.log" Feb 27 01:20:37 crc kubenswrapper[4781]: I0227 01:20:37.743082 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft_b41e2a48-4103-4cf3-be92-92180cbb2510/util/0.log" Feb 27 01:20:37 crc kubenswrapper[4781]: I0227 01:20:37.761737 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft_b41e2a48-4103-4cf3-be92-92180cbb2510/extract/0.log" Feb 27 01:20:37 crc kubenswrapper[4781]: I0227 01:20:37.892967 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9_d6e87b6c-eb25-4485-b639-6181c0ad86c7/util/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.127039 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9_d6e87b6c-eb25-4485-b639-6181c0ad86c7/pull/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.141269 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9_d6e87b6c-eb25-4485-b639-6181c0ad86c7/util/0.log" Feb 27 
01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.150067 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9_d6e87b6c-eb25-4485-b639-6181c0ad86c7/pull/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.318412 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9_d6e87b6c-eb25-4485-b639-6181c0ad86c7/util/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.379157 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9_d6e87b6c-eb25-4485-b639-6181c0ad86c7/pull/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.388976 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9_d6e87b6c-eb25-4485-b639-6181c0ad86c7/extract/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.507929 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kpswm_9186313b-02fa-4d6f-9394-ab05a9e3d7d4/extract-utilities/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.737064 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kpswm_9186313b-02fa-4d6f-9394-ab05a9e3d7d4/extract-utilities/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.744229 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kpswm_9186313b-02fa-4d6f-9394-ab05a9e3d7d4/extract-content/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.766842 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kpswm_9186313b-02fa-4d6f-9394-ab05a9e3d7d4/extract-content/0.log" Feb 27 
01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.984303 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kpswm_9186313b-02fa-4d6f-9394-ab05a9e3d7d4/extract-utilities/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.994189 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kpswm_9186313b-02fa-4d6f-9394-ab05a9e3d7d4/extract-content/0.log" Feb 27 01:20:39 crc kubenswrapper[4781]: I0227 01:20:39.281959 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pjpww_5ef2a1c8-c174-456d-adff-2693b022fa83/extract-utilities/0.log" Feb 27 01:20:39 crc kubenswrapper[4781]: I0227 01:20:39.512699 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pjpww_5ef2a1c8-c174-456d-adff-2693b022fa83/extract-content/0.log" Feb 27 01:20:39 crc kubenswrapper[4781]: I0227 01:20:39.513101 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pjpww_5ef2a1c8-c174-456d-adff-2693b022fa83/extract-utilities/0.log" Feb 27 01:20:39 crc kubenswrapper[4781]: I0227 01:20:39.539878 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pjpww_5ef2a1c8-c174-456d-adff-2693b022fa83/extract-content/0.log" Feb 27 01:20:39 crc kubenswrapper[4781]: I0227 01:20:39.653071 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kpswm_9186313b-02fa-4d6f-9394-ab05a9e3d7d4/registry-server/0.log" Feb 27 01:20:39 crc kubenswrapper[4781]: I0227 01:20:39.840536 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pjpww_5ef2a1c8-c174-456d-adff-2693b022fa83/extract-content/0.log" Feb 27 01:20:39 crc kubenswrapper[4781]: I0227 01:20:39.840942 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-pjpww_5ef2a1c8-c174-456d-adff-2693b022fa83/extract-utilities/0.log" Feb 27 01:20:39 crc kubenswrapper[4781]: I0227 01:20:39.913254 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv_2112f4cb-1229-4856-b3ec-a882e6fba5a6/util/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 01:20:40.460104 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pjpww_5ef2a1c8-c174-456d-adff-2693b022fa83/registry-server/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 01:20:40.474846 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv_2112f4cb-1229-4856-b3ec-a882e6fba5a6/pull/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 01:20:40.478558 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv_2112f4cb-1229-4856-b3ec-a882e6fba5a6/util/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 01:20:40.526257 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv_2112f4cb-1229-4856-b3ec-a882e6fba5a6/pull/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 01:20:40.621729 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv_2112f4cb-1229-4856-b3ec-a882e6fba5a6/util/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 01:20:40.658931 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv_2112f4cb-1229-4856-b3ec-a882e6fba5a6/extract/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 
01:20:40.680513 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv_2112f4cb-1229-4856-b3ec-a882e6fba5a6/pull/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 01:20:40.757233 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-h5lrz_672e121e-2b7f-4454-b628-d99032669167/marketplace-operator/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 01:20:40.859594 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sw9sn_1b6e0f47-560e-4d1a-8414-b65b1a159c68/extract-utilities/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.065325 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sw9sn_1b6e0f47-560e-4d1a-8414-b65b1a159c68/extract-utilities/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.066539 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sw9sn_1b6e0f47-560e-4d1a-8414-b65b1a159c68/extract-content/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.069754 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sw9sn_1b6e0f47-560e-4d1a-8414-b65b1a159c68/extract-content/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.235207 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sw9sn_1b6e0f47-560e-4d1a-8414-b65b1a159c68/extract-utilities/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.264702 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sw9sn_1b6e0f47-560e-4d1a-8414-b65b1a159c68/extract-content/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.316376 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-sp6x7_dc9df096-6538-4b50-8536-bfdd5474eece/extract-utilities/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.460994 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sw9sn_1b6e0f47-560e-4d1a-8414-b65b1a159c68/registry-server/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.564250 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sp6x7_dc9df096-6538-4b50-8536-bfdd5474eece/extract-content/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.597099 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sp6x7_dc9df096-6538-4b50-8536-bfdd5474eece/extract-utilities/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.605834 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sp6x7_dc9df096-6538-4b50-8536-bfdd5474eece/extract-content/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.822382 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sp6x7_dc9df096-6538-4b50-8536-bfdd5474eece/extract-utilities/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.841878 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sp6x7_dc9df096-6538-4b50-8536-bfdd5474eece/extract-content/0.log" Feb 27 01:20:42 crc kubenswrapper[4781]: I0227 01:20:42.323785 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sp6x7_dc9df096-6538-4b50-8536-bfdd5474eece/registry-server/0.log" Feb 27 01:20:57 crc kubenswrapper[4781]: I0227 01:20:57.668351 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-rbdmr_c62f5f48-b15f-4d70-837c-a05addc48839/prometheus-operator/0.log" Feb 27 01:20:57 crc kubenswrapper[4781]: I0227 01:20:57.709475 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_cbb658fa-808d-4c87-b81e-63863f31382f/prometheus-operator-admission-webhook/0.log" Feb 27 01:20:57 crc kubenswrapper[4781]: I0227 01:20:57.718175 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db/prometheus-operator-admission-webhook/0.log" Feb 27 01:20:57 crc kubenswrapper[4781]: I0227 01:20:57.880920 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-l5ppf_1a3a6a15-797e-4cfe-8e21-3a813460012d/perses-operator/0.log" Feb 27 01:20:57 crc kubenswrapper[4781]: I0227 01:20:57.893718 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-m6jxs_3fe8e5f0-6c7b-42bd-9604-85a90477d143/operator/0.log" Feb 27 01:21:11 crc kubenswrapper[4781]: I0227 01:21:11.056270 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6c4cf64b95-qzbxj_1fed4c33-9f3f-486b-8f74-f2d9a09b92be/kube-rbac-proxy/0.log" Feb 27 01:21:11 crc kubenswrapper[4781]: I0227 01:21:11.161073 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6c4cf64b95-qzbxj_1fed4c33-9f3f-486b-8f74-f2d9a09b92be/manager/0.log" Feb 27 01:21:38 crc kubenswrapper[4781]: I0227 01:21:38.966368 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7qsmt"] Feb 27 01:21:38 crc kubenswrapper[4781]: E0227 01:21:38.974992 4781 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="960f5179-f532-4fbf-90fa-e19414cbe684" containerName="oc" Feb 27 01:21:38 crc kubenswrapper[4781]: I0227 01:21:38.975012 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="960f5179-f532-4fbf-90fa-e19414cbe684" containerName="oc" Feb 27 01:21:38 crc kubenswrapper[4781]: I0227 01:21:38.975247 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="960f5179-f532-4fbf-90fa-e19414cbe684" containerName="oc" Feb 27 01:21:38 crc kubenswrapper[4781]: I0227 01:21:38.976829 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:38 crc kubenswrapper[4781]: I0227 01:21:38.978009 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qsmt"] Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.123977 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppht2\" (UniqueName: \"kubernetes.io/projected/ee5018ff-4da3-4cec-9b08-6a503f0607de-kube-api-access-ppht2\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.124317 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-catalog-content\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.124478 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-utilities\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " 
pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.226069 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppht2\" (UniqueName: \"kubernetes.io/projected/ee5018ff-4da3-4cec-9b08-6a503f0607de-kube-api-access-ppht2\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.226182 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-catalog-content\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.226257 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-utilities\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.226834 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-catalog-content\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.226865 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-utilities\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc 
kubenswrapper[4781]: I0227 01:21:39.255088 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppht2\" (UniqueName: \"kubernetes.io/projected/ee5018ff-4da3-4cec-9b08-6a503f0607de-kube-api-access-ppht2\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.299662 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.802774 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qsmt"] Feb 27 01:21:40 crc kubenswrapper[4781]: I0227 01:21:40.323289 4781 generic.go:334] "Generic (PLEG): container finished" podID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerID="6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e" exitCode=0 Feb 27 01:21:40 crc kubenswrapper[4781]: I0227 01:21:40.323505 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qsmt" event={"ID":"ee5018ff-4da3-4cec-9b08-6a503f0607de","Type":"ContainerDied","Data":"6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e"} Feb 27 01:21:40 crc kubenswrapper[4781]: I0227 01:21:40.324555 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qsmt" event={"ID":"ee5018ff-4da3-4cec-9b08-6a503f0607de","Type":"ContainerStarted","Data":"414a71779dc04d471984487e92a13a48f31246dd92f455893f57cfae9ff3685e"} Feb 27 01:21:42 crc kubenswrapper[4781]: I0227 01:21:42.344356 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qsmt" event={"ID":"ee5018ff-4da3-4cec-9b08-6a503f0607de","Type":"ContainerStarted","Data":"12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c"} Feb 27 01:21:42 crc kubenswrapper[4781]: I0227 
01:21:42.895551 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:21:42 crc kubenswrapper[4781]: I0227 01:21:42.895958 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:21:48 crc kubenswrapper[4781]: I0227 01:21:48.400076 4781 generic.go:334] "Generic (PLEG): container finished" podID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerID="12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c" exitCode=0 Feb 27 01:21:48 crc kubenswrapper[4781]: I0227 01:21:48.400154 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qsmt" event={"ID":"ee5018ff-4da3-4cec-9b08-6a503f0607de","Type":"ContainerDied","Data":"12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c"} Feb 27 01:21:49 crc kubenswrapper[4781]: I0227 01:21:49.413218 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qsmt" event={"ID":"ee5018ff-4da3-4cec-9b08-6a503f0607de","Type":"ContainerStarted","Data":"a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e"} Feb 27 01:21:59 crc kubenswrapper[4781]: I0227 01:21:59.300582 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:59 crc kubenswrapper[4781]: I0227 01:21:59.301390 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:59 crc 
kubenswrapper[4781]: I0227 01:21:59.347174 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:59 crc kubenswrapper[4781]: I0227 01:21:59.366762 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7qsmt" podStartSLOduration=12.898298925 podStartE2EDuration="21.366740654s" podCreationTimestamp="2026-02-27 01:21:38 +0000 UTC" firstStartedPulling="2026-02-27 01:21:40.326017376 +0000 UTC m=+4569.583556930" lastFinishedPulling="2026-02-27 01:21:48.794459105 +0000 UTC m=+4578.051998659" observedRunningTime="2026-02-27 01:21:49.435937588 +0000 UTC m=+4578.693477152" watchObservedRunningTime="2026-02-27 01:21:59.366740654 +0000 UTC m=+4588.624280208" Feb 27 01:21:59 crc kubenswrapper[4781]: I0227 01:21:59.572619 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:59 crc kubenswrapper[4781]: I0227 01:21:59.614715 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qsmt"] Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.154795 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535922-v2mrc"] Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.156465 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.159535 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.159901 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.160142 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.166911 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535922-v2mrc"] Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.278509 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzb6n\" (UniqueName: \"kubernetes.io/projected/a4de45e6-34d0-42f2-a5ef-3db90864a559-kube-api-access-xzb6n\") pod \"auto-csr-approver-29535922-v2mrc\" (UID: \"a4de45e6-34d0-42f2-a5ef-3db90864a559\") " pod="openshift-infra/auto-csr-approver-29535922-v2mrc" Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.381057 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzb6n\" (UniqueName: \"kubernetes.io/projected/a4de45e6-34d0-42f2-a5ef-3db90864a559-kube-api-access-xzb6n\") pod \"auto-csr-approver-29535922-v2mrc\" (UID: \"a4de45e6-34d0-42f2-a5ef-3db90864a559\") " pod="openshift-infra/auto-csr-approver-29535922-v2mrc" Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.406561 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzb6n\" (UniqueName: \"kubernetes.io/projected/a4de45e6-34d0-42f2-a5ef-3db90864a559-kube-api-access-xzb6n\") pod \"auto-csr-approver-29535922-v2mrc\" (UID: \"a4de45e6-34d0-42f2-a5ef-3db90864a559\") " 
pod="openshift-infra/auto-csr-approver-29535922-v2mrc" Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.485977 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" Feb 27 01:22:00 crc kubenswrapper[4781]: W0227 01:22:00.990667 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4de45e6_34d0_42f2_a5ef_3db90864a559.slice/crio-8740226de6b5e5ac6030941401dcadd96358bc95163609424d4139c880f66926 WatchSource:0}: Error finding container 8740226de6b5e5ac6030941401dcadd96358bc95163609424d4139c880f66926: Status 404 returned error can't find the container with id 8740226de6b5e5ac6030941401dcadd96358bc95163609424d4139c880f66926 Feb 27 01:22:01 crc kubenswrapper[4781]: I0227 01:22:01.002948 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535922-v2mrc"] Feb 27 01:22:01 crc kubenswrapper[4781]: I0227 01:22:01.549403 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7qsmt" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerName="registry-server" containerID="cri-o://a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e" gracePeriod=2 Feb 27 01:22:01 crc kubenswrapper[4781]: I0227 01:22:01.549810 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" event={"ID":"a4de45e6-34d0-42f2-a5ef-3db90864a559","Type":"ContainerStarted","Data":"8740226de6b5e5ac6030941401dcadd96358bc95163609424d4139c880f66926"} Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.333282 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.436716 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-utilities\") pod \"ee5018ff-4da3-4cec-9b08-6a503f0607de\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.437112 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-catalog-content\") pod \"ee5018ff-4da3-4cec-9b08-6a503f0607de\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.437169 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppht2\" (UniqueName: \"kubernetes.io/projected/ee5018ff-4da3-4cec-9b08-6a503f0607de-kube-api-access-ppht2\") pod \"ee5018ff-4da3-4cec-9b08-6a503f0607de\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.438330 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-utilities" (OuterVolumeSpecName: "utilities") pod "ee5018ff-4da3-4cec-9b08-6a503f0607de" (UID: "ee5018ff-4da3-4cec-9b08-6a503f0607de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.446536 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee5018ff-4da3-4cec-9b08-6a503f0607de-kube-api-access-ppht2" (OuterVolumeSpecName: "kube-api-access-ppht2") pod "ee5018ff-4da3-4cec-9b08-6a503f0607de" (UID: "ee5018ff-4da3-4cec-9b08-6a503f0607de"). InnerVolumeSpecName "kube-api-access-ppht2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.539482 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppht2\" (UniqueName: \"kubernetes.io/projected/ee5018ff-4da3-4cec-9b08-6a503f0607de-kube-api-access-ppht2\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.539937 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.560399 4781 generic.go:334] "Generic (PLEG): container finished" podID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerID="a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e" exitCode=0 Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.560468 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qsmt" event={"ID":"ee5018ff-4da3-4cec-9b08-6a503f0607de","Type":"ContainerDied","Data":"a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e"} Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.560502 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qsmt" event={"ID":"ee5018ff-4da3-4cec-9b08-6a503f0607de","Type":"ContainerDied","Data":"414a71779dc04d471984487e92a13a48f31246dd92f455893f57cfae9ff3685e"} Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.560542 4781 scope.go:117] "RemoveContainer" containerID="a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.560705 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.565121 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" event={"ID":"a4de45e6-34d0-42f2-a5ef-3db90864a559","Type":"ContainerStarted","Data":"2f2dfc7d2070f93d2a77b14a366a4665e36d3d475590bb3af0ca699d7f8bebd2"} Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.582201 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" podStartSLOduration=1.647149111 podStartE2EDuration="2.582177329s" podCreationTimestamp="2026-02-27 01:22:00 +0000 UTC" firstStartedPulling="2026-02-27 01:22:00.993990944 +0000 UTC m=+4590.251530498" lastFinishedPulling="2026-02-27 01:22:01.929019162 +0000 UTC m=+4591.186558716" observedRunningTime="2026-02-27 01:22:02.580617177 +0000 UTC m=+4591.838156731" watchObservedRunningTime="2026-02-27 01:22:02.582177329 +0000 UTC m=+4591.839716883" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.584604 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee5018ff-4da3-4cec-9b08-6a503f0607de" (UID: "ee5018ff-4da3-4cec-9b08-6a503f0607de"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.587107 4781 scope.go:117] "RemoveContainer" containerID="12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.610607 4781 scope.go:117] "RemoveContainer" containerID="6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.634301 4781 scope.go:117] "RemoveContainer" containerID="a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e" Feb 27 01:22:02 crc kubenswrapper[4781]: E0227 01:22:02.634955 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e\": container with ID starting with a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e not found: ID does not exist" containerID="a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.634986 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e"} err="failed to get container status \"a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e\": rpc error: code = NotFound desc = could not find container \"a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e\": container with ID starting with a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e not found: ID does not exist" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.635006 4781 scope.go:117] "RemoveContainer" containerID="12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c" Feb 27 01:22:02 crc kubenswrapper[4781]: E0227 01:22:02.635296 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c\": container with ID starting with 12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c not found: ID does not exist" containerID="12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.635377 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c"} err="failed to get container status \"12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c\": rpc error: code = NotFound desc = could not find container \"12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c\": container with ID starting with 12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c not found: ID does not exist" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.635442 4781 scope.go:117] "RemoveContainer" containerID="6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e" Feb 27 01:22:02 crc kubenswrapper[4781]: E0227 01:22:02.635819 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e\": container with ID starting with 6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e not found: ID does not exist" containerID="6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.635847 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e"} err="failed to get container status \"6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e\": rpc error: code = NotFound desc = could not find container \"6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e\": 
container with ID starting with 6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e not found: ID does not exist" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.642564 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.903045 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qsmt"] Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.913604 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7qsmt"] Feb 27 01:22:03 crc kubenswrapper[4781]: I0227 01:22:03.322386 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" path="/var/lib/kubelet/pods/ee5018ff-4da3-4cec-9b08-6a503f0607de/volumes" Feb 27 01:22:03 crc kubenswrapper[4781]: I0227 01:22:03.578238 4781 generic.go:334] "Generic (PLEG): container finished" podID="a4de45e6-34d0-42f2-a5ef-3db90864a559" containerID="2f2dfc7d2070f93d2a77b14a366a4665e36d3d475590bb3af0ca699d7f8bebd2" exitCode=0 Feb 27 01:22:03 crc kubenswrapper[4781]: I0227 01:22:03.578285 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" event={"ID":"a4de45e6-34d0-42f2-a5ef-3db90864a559","Type":"ContainerDied","Data":"2f2dfc7d2070f93d2a77b14a366a4665e36d3d475590bb3af0ca699d7f8bebd2"} Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.053498 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.191055 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzb6n\" (UniqueName: \"kubernetes.io/projected/a4de45e6-34d0-42f2-a5ef-3db90864a559-kube-api-access-xzb6n\") pod \"a4de45e6-34d0-42f2-a5ef-3db90864a559\" (UID: \"a4de45e6-34d0-42f2-a5ef-3db90864a559\") " Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.197382 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4de45e6-34d0-42f2-a5ef-3db90864a559-kube-api-access-xzb6n" (OuterVolumeSpecName: "kube-api-access-xzb6n") pod "a4de45e6-34d0-42f2-a5ef-3db90864a559" (UID: "a4de45e6-34d0-42f2-a5ef-3db90864a559"). InnerVolumeSpecName "kube-api-access-xzb6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.293992 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzb6n\" (UniqueName: \"kubernetes.io/projected/a4de45e6-34d0-42f2-a5ef-3db90864a559-kube-api-access-xzb6n\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.595100 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" event={"ID":"a4de45e6-34d0-42f2-a5ef-3db90864a559","Type":"ContainerDied","Data":"8740226de6b5e5ac6030941401dcadd96358bc95163609424d4139c880f66926"} Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.595145 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8740226de6b5e5ac6030941401dcadd96358bc95163609424d4139c880f66926" Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.595150 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.650876 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535916-5rslt"] Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.663064 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535916-5rslt"] Feb 27 01:22:07 crc kubenswrapper[4781]: I0227 01:22:07.321508 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deca34b2-a27c-46b9-bbe3-ac2d08a7a72e" path="/var/lib/kubelet/pods/deca34b2-a27c-46b9-bbe3-ac2d08a7a72e/volumes" Feb 27 01:22:12 crc kubenswrapper[4781]: I0227 01:22:12.895450 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:22:12 crc kubenswrapper[4781]: I0227 01:22:12.895993 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.372321 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zbdrn"] Feb 27 01:22:31 crc kubenswrapper[4781]: E0227 01:22:31.373248 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerName="extract-utilities" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.373261 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerName="extract-utilities" Feb 
27 01:22:31 crc kubenswrapper[4781]: E0227 01:22:31.373279 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerName="extract-content" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.373285 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerName="extract-content" Feb 27 01:22:31 crc kubenswrapper[4781]: E0227 01:22:31.373301 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerName="registry-server" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.373308 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerName="registry-server" Feb 27 01:22:31 crc kubenswrapper[4781]: E0227 01:22:31.373336 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4de45e6-34d0-42f2-a5ef-3db90864a559" containerName="oc" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.373342 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4de45e6-34d0-42f2-a5ef-3db90864a559" containerName="oc" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.373537 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4de45e6-34d0-42f2-a5ef-3db90864a559" containerName="oc" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.373564 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerName="registry-server" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.375119 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.391563 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbdrn"] Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.445712 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-utilities\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.445809 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2n4m\" (UniqueName: \"kubernetes.io/projected/219ad386-328f-4166-a266-c28815b457f5-kube-api-access-f2n4m\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.445846 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-catalog-content\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.547971 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-utilities\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.548058 4781 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-f2n4m\" (UniqueName: \"kubernetes.io/projected/219ad386-328f-4166-a266-c28815b457f5-kube-api-access-f2n4m\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.548101 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-catalog-content\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.548555 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-utilities\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.548566 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-catalog-content\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.568788 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2n4m\" (UniqueName: \"kubernetes.io/projected/219ad386-328f-4166-a266-c28815b457f5-kube-api-access-f2n4m\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.694992 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:32 crc kubenswrapper[4781]: I0227 01:22:32.257339 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbdrn"] Feb 27 01:22:32 crc kubenswrapper[4781]: I0227 01:22:32.874079 4781 generic.go:334] "Generic (PLEG): container finished" podID="219ad386-328f-4166-a266-c28815b457f5" containerID="2462823339fca04753fa339fcde4fab6b795ec9ca5111df0b4be4d7401713029" exitCode=0 Feb 27 01:22:32 crc kubenswrapper[4781]: I0227 01:22:32.874124 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbdrn" event={"ID":"219ad386-328f-4166-a266-c28815b457f5","Type":"ContainerDied","Data":"2462823339fca04753fa339fcde4fab6b795ec9ca5111df0b4be4d7401713029"} Feb 27 01:22:32 crc kubenswrapper[4781]: I0227 01:22:32.874500 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbdrn" event={"ID":"219ad386-328f-4166-a266-c28815b457f5","Type":"ContainerStarted","Data":"ac148faebdbe3f69c3c48192b284b267a5615dd867270aef97bbadbb2eee1df2"} Feb 27 01:22:33 crc kubenswrapper[4781]: I0227 01:22:33.832912 4781 scope.go:117] "RemoveContainer" containerID="a89d93284b5be38596ce103c331565c9dbf5be828da69afb3c56f041c046abb6" Feb 27 01:22:33 crc kubenswrapper[4781]: I0227 01:22:33.876562 4781 scope.go:117] "RemoveContainer" containerID="5429009dce4ed7561680c8a6236f2fd38e0d3ba334a4b82f95acb92d3f8dce94" Feb 27 01:22:33 crc kubenswrapper[4781]: I0227 01:22:33.911524 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbdrn" event={"ID":"219ad386-328f-4166-a266-c28815b457f5","Type":"ContainerStarted","Data":"455e0a696ef59f3cb89f3818dfa8c96ddb587193ce3da25f9680c0d5023ff776"} Feb 27 01:22:35 crc kubenswrapper[4781]: I0227 01:22:35.931680 4781 generic.go:334] "Generic (PLEG): container finished" podID="219ad386-328f-4166-a266-c28815b457f5" 
containerID="455e0a696ef59f3cb89f3818dfa8c96ddb587193ce3da25f9680c0d5023ff776" exitCode=0 Feb 27 01:22:35 crc kubenswrapper[4781]: I0227 01:22:35.931759 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbdrn" event={"ID":"219ad386-328f-4166-a266-c28815b457f5","Type":"ContainerDied","Data":"455e0a696ef59f3cb89f3818dfa8c96ddb587193ce3da25f9680c0d5023ff776"} Feb 27 01:22:36 crc kubenswrapper[4781]: I0227 01:22:36.944302 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbdrn" event={"ID":"219ad386-328f-4166-a266-c28815b457f5","Type":"ContainerStarted","Data":"ffeaba9e07befd01e497bb0680cce40a54fb996db73d3bbe5052d97e412313ac"} Feb 27 01:22:36 crc kubenswrapper[4781]: I0227 01:22:36.976069 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zbdrn" podStartSLOduration=2.477697144 podStartE2EDuration="5.976044528s" podCreationTimestamp="2026-02-27 01:22:31 +0000 UTC" firstStartedPulling="2026-02-27 01:22:32.875874961 +0000 UTC m=+4622.133414515" lastFinishedPulling="2026-02-27 01:22:36.374222345 +0000 UTC m=+4625.631761899" observedRunningTime="2026-02-27 01:22:36.962814894 +0000 UTC m=+4626.220354448" watchObservedRunningTime="2026-02-27 01:22:36.976044528 +0000 UTC m=+4626.233584082" Feb 27 01:22:41 crc kubenswrapper[4781]: I0227 01:22:41.697922 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:41 crc kubenswrapper[4781]: I0227 01:22:41.698655 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:41 crc kubenswrapper[4781]: I0227 01:22:41.742331 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:42 crc kubenswrapper[4781]: I0227 01:22:42.043616 
4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:42 crc kubenswrapper[4781]: I0227 01:22:42.096501 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbdrn"] Feb 27 01:22:42 crc kubenswrapper[4781]: I0227 01:22:42.896111 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:22:42 crc kubenswrapper[4781]: I0227 01:22:42.896460 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:22:42 crc kubenswrapper[4781]: I0227 01:22:42.896506 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 01:22:42 crc kubenswrapper[4781]: I0227 01:22:42.897307 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 01:22:42 crc kubenswrapper[4781]: I0227 01:22:42.897371 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" 
containerID="cri-o://aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" gracePeriod=600 Feb 27 01:22:43 crc kubenswrapper[4781]: E0227 01:22:43.037375 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:22:44 crc kubenswrapper[4781]: I0227 01:22:44.013657 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" exitCode=0 Feb 27 01:22:44 crc kubenswrapper[4781]: I0227 01:22:44.013736 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f"} Feb 27 01:22:44 crc kubenswrapper[4781]: I0227 01:22:44.015313 4781 scope.go:117] "RemoveContainer" containerID="93857194fe96d9ea4ad88dce6987b56ca3a1bbc406106d6f82950d6a036e6c83" Feb 27 01:22:44 crc kubenswrapper[4781]: I0227 01:22:44.016128 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zbdrn" podUID="219ad386-328f-4166-a266-c28815b457f5" containerName="registry-server" containerID="cri-o://ffeaba9e07befd01e497bb0680cce40a54fb996db73d3bbe5052d97e412313ac" gracePeriod=2 Feb 27 01:22:44 crc kubenswrapper[4781]: I0227 01:22:44.016154 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:22:44 crc kubenswrapper[4781]: E0227 01:22:44.016741 4781 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.030078 4781 generic.go:334] "Generic (PLEG): container finished" podID="219ad386-328f-4166-a266-c28815b457f5" containerID="ffeaba9e07befd01e497bb0680cce40a54fb996db73d3bbe5052d97e412313ac" exitCode=0 Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.030136 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbdrn" event={"ID":"219ad386-328f-4166-a266-c28815b457f5","Type":"ContainerDied","Data":"ffeaba9e07befd01e497bb0680cce40a54fb996db73d3bbe5052d97e412313ac"} Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.307508 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.337346 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2n4m\" (UniqueName: \"kubernetes.io/projected/219ad386-328f-4166-a266-c28815b457f5-kube-api-access-f2n4m\") pod \"219ad386-328f-4166-a266-c28815b457f5\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.349655 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219ad386-328f-4166-a266-c28815b457f5-kube-api-access-f2n4m" (OuterVolumeSpecName: "kube-api-access-f2n4m") pod "219ad386-328f-4166-a266-c28815b457f5" (UID: "219ad386-328f-4166-a266-c28815b457f5"). InnerVolumeSpecName "kube-api-access-f2n4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.441904 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-utilities\") pod \"219ad386-328f-4166-a266-c28815b457f5\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.441990 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-catalog-content\") pod \"219ad386-328f-4166-a266-c28815b457f5\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.445691 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2n4m\" (UniqueName: \"kubernetes.io/projected/219ad386-328f-4166-a266-c28815b457f5-kube-api-access-f2n4m\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.447945 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-utilities" (OuterVolumeSpecName: "utilities") pod "219ad386-328f-4166-a266-c28815b457f5" (UID: "219ad386-328f-4166-a266-c28815b457f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.486911 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "219ad386-328f-4166-a266-c28815b457f5" (UID: "219ad386-328f-4166-a266-c28815b457f5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.547475 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.547519 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:46 crc kubenswrapper[4781]: I0227 01:22:46.042169 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbdrn" event={"ID":"219ad386-328f-4166-a266-c28815b457f5","Type":"ContainerDied","Data":"ac148faebdbe3f69c3c48192b284b267a5615dd867270aef97bbadbb2eee1df2"} Feb 27 01:22:46 crc kubenswrapper[4781]: I0227 01:22:46.042225 4781 scope.go:117] "RemoveContainer" containerID="ffeaba9e07befd01e497bb0680cce40a54fb996db73d3bbe5052d97e412313ac" Feb 27 01:22:46 crc kubenswrapper[4781]: I0227 01:22:46.042238 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:46 crc kubenswrapper[4781]: I0227 01:22:46.060982 4781 scope.go:117] "RemoveContainer" containerID="455e0a696ef59f3cb89f3818dfa8c96ddb587193ce3da25f9680c0d5023ff776" Feb 27 01:22:46 crc kubenswrapper[4781]: I0227 01:22:46.082268 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbdrn"] Feb 27 01:22:46 crc kubenswrapper[4781]: I0227 01:22:46.093976 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbdrn"] Feb 27 01:22:46 crc kubenswrapper[4781]: I0227 01:22:46.393775 4781 scope.go:117] "RemoveContainer" containerID="2462823339fca04753fa339fcde4fab6b795ec9ca5111df0b4be4d7401713029" Feb 27 01:22:47 crc kubenswrapper[4781]: I0227 01:22:47.323492 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="219ad386-328f-4166-a266-c28815b457f5" path="/var/lib/kubelet/pods/219ad386-328f-4166-a266-c28815b457f5/volumes" Feb 27 01:22:55 crc kubenswrapper[4781]: I0227 01:22:55.310415 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:22:55 crc kubenswrapper[4781]: E0227 01:22:55.311429 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:23:09 crc kubenswrapper[4781]: I0227 01:23:09.313777 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:23:09 crc kubenswrapper[4781]: E0227 01:23:09.320194 4781 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:23:10 crc kubenswrapper[4781]: I0227 01:23:10.266535 4781 generic.go:334] "Generic (PLEG): container finished" podID="03276b70-f5f8-486f-beb1-070a017efc66" containerID="8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81" exitCode=0 Feb 27 01:23:10 crc kubenswrapper[4781]: I0227 01:23:10.266638 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/must-gather-b97zf" event={"ID":"03276b70-f5f8-486f-beb1-070a017efc66","Type":"ContainerDied","Data":"8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81"} Feb 27 01:23:10 crc kubenswrapper[4781]: I0227 01:23:10.267828 4781 scope.go:117] "RemoveContainer" containerID="8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81" Feb 27 01:23:10 crc kubenswrapper[4781]: I0227 01:23:10.510573 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vvzsl_must-gather-b97zf_03276b70-f5f8-486f-beb1-070a017efc66/gather/0.log" Feb 27 01:23:19 crc kubenswrapper[4781]: I0227 01:23:19.655446 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vvzsl/must-gather-b97zf"] Feb 27 01:23:19 crc kubenswrapper[4781]: I0227 01:23:19.656069 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vvzsl/must-gather-b97zf" podUID="03276b70-f5f8-486f-beb1-070a017efc66" containerName="copy" containerID="cri-o://7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234" gracePeriod=2 Feb 27 01:23:19 crc kubenswrapper[4781]: I0227 01:23:19.666242 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-vvzsl/must-gather-b97zf"] Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.305069 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vvzsl_must-gather-b97zf_03276b70-f5f8-486f-beb1-070a017efc66/copy/0.log" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.305895 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vvzsl/must-gather-b97zf" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.371037 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vvzsl_must-gather-b97zf_03276b70-f5f8-486f-beb1-070a017efc66/copy/0.log" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.371812 4781 generic.go:334] "Generic (PLEG): container finished" podID="03276b70-f5f8-486f-beb1-070a017efc66" containerID="7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234" exitCode=143 Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.371864 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vvzsl/must-gather-b97zf" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.371879 4781 scope.go:117] "RemoveContainer" containerID="7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.392408 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6k9r\" (UniqueName: \"kubernetes.io/projected/03276b70-f5f8-486f-beb1-070a017efc66-kube-api-access-q6k9r\") pod \"03276b70-f5f8-486f-beb1-070a017efc66\" (UID: \"03276b70-f5f8-486f-beb1-070a017efc66\") " Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.392539 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/03276b70-f5f8-486f-beb1-070a017efc66-must-gather-output\") pod \"03276b70-f5f8-486f-beb1-070a017efc66\" (UID: \"03276b70-f5f8-486f-beb1-070a017efc66\") " Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.396952 4781 scope.go:117] "RemoveContainer" containerID="8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.407141 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03276b70-f5f8-486f-beb1-070a017efc66-kube-api-access-q6k9r" (OuterVolumeSpecName: "kube-api-access-q6k9r") pod "03276b70-f5f8-486f-beb1-070a017efc66" (UID: "03276b70-f5f8-486f-beb1-070a017efc66"). InnerVolumeSpecName "kube-api-access-q6k9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.495698 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6k9r\" (UniqueName: \"kubernetes.io/projected/03276b70-f5f8-486f-beb1-070a017efc66-kube-api-access-q6k9r\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.544527 4781 scope.go:117] "RemoveContainer" containerID="7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234" Feb 27 01:23:20 crc kubenswrapper[4781]: E0227 01:23:20.545143 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234\": container with ID starting with 7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234 not found: ID does not exist" containerID="7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.545191 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234"} err="failed to get container status \"7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234\": rpc error: code = NotFound desc = could not find container \"7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234\": container with ID starting with 7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234 not found: ID does not exist" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.545224 4781 scope.go:117] "RemoveContainer" containerID="8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81" Feb 27 01:23:20 crc kubenswrapper[4781]: E0227 01:23:20.545724 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81\": container with ID starting with 8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81 not found: ID does not exist" containerID="8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.545755 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81"} err="failed to get container status \"8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81\": rpc error: code = NotFound desc = could not find container \"8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81\": container with ID starting with 8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81 not found: ID does not exist" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.582583 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03276b70-f5f8-486f-beb1-070a017efc66-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "03276b70-f5f8-486f-beb1-070a017efc66" (UID: "03276b70-f5f8-486f-beb1-070a017efc66"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.597410 4781 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/03276b70-f5f8-486f-beb1-070a017efc66-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:21 crc kubenswrapper[4781]: I0227 01:23:21.321241 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:23:21 crc kubenswrapper[4781]: E0227 01:23:21.321569 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:23:21 crc kubenswrapper[4781]: I0227 01:23:21.326572 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03276b70-f5f8-486f-beb1-070a017efc66" path="/var/lib/kubelet/pods/03276b70-f5f8-486f-beb1-070a017efc66/volumes" Feb 27 01:23:33 crc kubenswrapper[4781]: I0227 01:23:33.312805 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:23:33 crc kubenswrapper[4781]: E0227 01:23:33.313714 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.015145 4781 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6fkgd"] Feb 27 01:23:38 crc kubenswrapper[4781]: E0227 01:23:38.015989 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219ad386-328f-4166-a266-c28815b457f5" containerName="extract-content" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.016003 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="219ad386-328f-4166-a266-c28815b457f5" containerName="extract-content" Feb 27 01:23:38 crc kubenswrapper[4781]: E0227 01:23:38.016026 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03276b70-f5f8-486f-beb1-070a017efc66" containerName="copy" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.016032 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="03276b70-f5f8-486f-beb1-070a017efc66" containerName="copy" Feb 27 01:23:38 crc kubenswrapper[4781]: E0227 01:23:38.016051 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03276b70-f5f8-486f-beb1-070a017efc66" containerName="gather" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.016057 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="03276b70-f5f8-486f-beb1-070a017efc66" containerName="gather" Feb 27 01:23:38 crc kubenswrapper[4781]: E0227 01:23:38.016071 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219ad386-328f-4166-a266-c28815b457f5" containerName="extract-utilities" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.016076 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="219ad386-328f-4166-a266-c28815b457f5" containerName="extract-utilities" Feb 27 01:23:38 crc kubenswrapper[4781]: E0227 01:23:38.016088 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219ad386-328f-4166-a266-c28815b457f5" containerName="registry-server" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.016093 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="219ad386-328f-4166-a266-c28815b457f5" containerName="registry-server" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.016280 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="03276b70-f5f8-486f-beb1-070a017efc66" containerName="copy" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.016291 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="219ad386-328f-4166-a266-c28815b457f5" containerName="registry-server" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.016300 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="03276b70-f5f8-486f-beb1-070a017efc66" containerName="gather" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.017798 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.031498 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6fkgd"] Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.152005 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-catalog-content\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.152304 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-utilities\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.152565 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-mx9bj\" (UniqueName: \"kubernetes.io/projected/dff525c7-90db-4e5e-b13a-33b5dbfdb372-kube-api-access-mx9bj\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.254929 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-catalog-content\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.255012 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-utilities\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.255108 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx9bj\" (UniqueName: \"kubernetes.io/projected/dff525c7-90db-4e5e-b13a-33b5dbfdb372-kube-api-access-mx9bj\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.255666 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-utilities\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.255953 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-catalog-content\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.275386 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx9bj\" (UniqueName: \"kubernetes.io/projected/dff525c7-90db-4e5e-b13a-33b5dbfdb372-kube-api-access-mx9bj\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.337730 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.894583 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6fkgd"] Feb 27 01:23:39 crc kubenswrapper[4781]: I0227 01:23:39.573704 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fkgd" event={"ID":"dff525c7-90db-4e5e-b13a-33b5dbfdb372","Type":"ContainerStarted","Data":"87d71a1b1a7c9221c815c99101feb08c7fff66f9d9415c51ea1d9ba83fed28e5"} Feb 27 01:23:40 crc kubenswrapper[4781]: I0227 01:23:40.589177 4781 generic.go:334] "Generic (PLEG): container finished" podID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerID="a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241" exitCode=0 Feb 27 01:23:40 crc kubenswrapper[4781]: I0227 01:23:40.589235 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fkgd" event={"ID":"dff525c7-90db-4e5e-b13a-33b5dbfdb372","Type":"ContainerDied","Data":"a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241"} Feb 27 01:23:40 crc kubenswrapper[4781]: I0227 01:23:40.592260 4781 
provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:23:42 crc kubenswrapper[4781]: I0227 01:23:42.608113 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fkgd" event={"ID":"dff525c7-90db-4e5e-b13a-33b5dbfdb372","Type":"ContainerStarted","Data":"c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5"} Feb 27 01:23:45 crc kubenswrapper[4781]: I0227 01:23:45.651876 4781 generic.go:334] "Generic (PLEG): container finished" podID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerID="c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5" exitCode=0 Feb 27 01:23:45 crc kubenswrapper[4781]: I0227 01:23:45.652078 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fkgd" event={"ID":"dff525c7-90db-4e5e-b13a-33b5dbfdb372","Type":"ContainerDied","Data":"c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5"} Feb 27 01:23:46 crc kubenswrapper[4781]: I0227 01:23:46.665902 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fkgd" event={"ID":"dff525c7-90db-4e5e-b13a-33b5dbfdb372","Type":"ContainerStarted","Data":"8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da"} Feb 27 01:23:46 crc kubenswrapper[4781]: I0227 01:23:46.686006 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6fkgd" podStartSLOduration=4.173127472 podStartE2EDuration="9.685990132s" podCreationTimestamp="2026-02-27 01:23:37 +0000 UTC" firstStartedPulling="2026-02-27 01:23:40.591974928 +0000 UTC m=+4689.849514482" lastFinishedPulling="2026-02-27 01:23:46.104837588 +0000 UTC m=+4695.362377142" observedRunningTime="2026-02-27 01:23:46.683583489 +0000 UTC m=+4695.941123043" watchObservedRunningTime="2026-02-27 01:23:46.685990132 +0000 UTC m=+4695.943529676" Feb 27 01:23:48 crc kubenswrapper[4781]: 
I0227 01:23:48.309865 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:23:48 crc kubenswrapper[4781]: E0227 01:23:48.310573 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:23:48 crc kubenswrapper[4781]: I0227 01:23:48.338122 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:48 crc kubenswrapper[4781]: I0227 01:23:48.338225 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:48 crc kubenswrapper[4781]: I0227 01:23:48.385040 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:58 crc kubenswrapper[4781]: I0227 01:23:58.386727 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:58 crc kubenswrapper[4781]: I0227 01:23:58.440078 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6fkgd"] Feb 27 01:23:58 crc kubenswrapper[4781]: I0227 01:23:58.774778 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6fkgd" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerName="registry-server" containerID="cri-o://8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da" gracePeriod=2 Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.495324 
4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.558222 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-catalog-content\") pod \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.558288 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx9bj\" (UniqueName: \"kubernetes.io/projected/dff525c7-90db-4e5e-b13a-33b5dbfdb372-kube-api-access-mx9bj\") pod \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.558452 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-utilities\") pod \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.559165 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-utilities" (OuterVolumeSpecName: "utilities") pod "dff525c7-90db-4e5e-b13a-33b5dbfdb372" (UID: "dff525c7-90db-4e5e-b13a-33b5dbfdb372"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.564446 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff525c7-90db-4e5e-b13a-33b5dbfdb372-kube-api-access-mx9bj" (OuterVolumeSpecName: "kube-api-access-mx9bj") pod "dff525c7-90db-4e5e-b13a-33b5dbfdb372" (UID: "dff525c7-90db-4e5e-b13a-33b5dbfdb372"). 
InnerVolumeSpecName "kube-api-access-mx9bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.613395 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dff525c7-90db-4e5e-b13a-33b5dbfdb372" (UID: "dff525c7-90db-4e5e-b13a-33b5dbfdb372"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.661318 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.661372 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.661397 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx9bj\" (UniqueName: \"kubernetes.io/projected/dff525c7-90db-4e5e-b13a-33b5dbfdb372-kube-api-access-mx9bj\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.785404 4781 generic.go:334] "Generic (PLEG): container finished" podID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerID="8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da" exitCode=0 Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.785464 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.785479 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fkgd" event={"ID":"dff525c7-90db-4e5e-b13a-33b5dbfdb372","Type":"ContainerDied","Data":"8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da"} Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.786352 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fkgd" event={"ID":"dff525c7-90db-4e5e-b13a-33b5dbfdb372","Type":"ContainerDied","Data":"87d71a1b1a7c9221c815c99101feb08c7fff66f9d9415c51ea1d9ba83fed28e5"} Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.786396 4781 scope.go:117] "RemoveContainer" containerID="8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.807011 4781 scope.go:117] "RemoveContainer" containerID="c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.824707 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6fkgd"] Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.831090 4781 scope.go:117] "RemoveContainer" containerID="a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.837676 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6fkgd"] Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.900524 4781 scope.go:117] "RemoveContainer" containerID="8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da" Feb 27 01:23:59 crc kubenswrapper[4781]: E0227 01:23:59.901342 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da\": container with ID starting with 8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da not found: ID does not exist" containerID="8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.901424 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da"} err="failed to get container status \"8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da\": rpc error: code = NotFound desc = could not find container \"8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da\": container with ID starting with 8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da not found: ID does not exist" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.901458 4781 scope.go:117] "RemoveContainer" containerID="c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5" Feb 27 01:23:59 crc kubenswrapper[4781]: E0227 01:23:59.901981 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5\": container with ID starting with c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5 not found: ID does not exist" containerID="c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.902025 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5"} err="failed to get container status \"c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5\": rpc error: code = NotFound desc = could not find container \"c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5\": container with ID 
starting with c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5 not found: ID does not exist" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.902053 4781 scope.go:117] "RemoveContainer" containerID="a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241" Feb 27 01:23:59 crc kubenswrapper[4781]: E0227 01:23:59.902467 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241\": container with ID starting with a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241 not found: ID does not exist" containerID="a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.902521 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241"} err="failed to get container status \"a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241\": rpc error: code = NotFound desc = could not find container \"a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241\": container with ID starting with a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241 not found: ID does not exist" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.147408 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535924-jqt2s"] Feb 27 01:24:00 crc kubenswrapper[4781]: E0227 01:24:00.148218 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerName="extract-utilities" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.148242 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerName="extract-utilities" Feb 27 01:24:00 crc kubenswrapper[4781]: E0227 01:24:00.148261 4781 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerName="extract-content" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.148268 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerName="extract-content" Feb 27 01:24:00 crc kubenswrapper[4781]: E0227 01:24:00.148287 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerName="registry-server" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.148294 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerName="registry-server" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.148546 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerName="registry-server" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.149348 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535924-jqt2s" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.152244 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.152554 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.153172 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.169397 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535924-jqt2s"] Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.275598 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdfnz\" (UniqueName: \"kubernetes.io/projected/5d4094f7-d1b4-4771-8d18-ea76f4f2afc6-kube-api-access-fdfnz\") pod \"auto-csr-approver-29535924-jqt2s\" (UID: \"5d4094f7-d1b4-4771-8d18-ea76f4f2afc6\") " pod="openshift-infra/auto-csr-approver-29535924-jqt2s" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.379526 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdfnz\" (UniqueName: \"kubernetes.io/projected/5d4094f7-d1b4-4771-8d18-ea76f4f2afc6-kube-api-access-fdfnz\") pod \"auto-csr-approver-29535924-jqt2s\" (UID: \"5d4094f7-d1b4-4771-8d18-ea76f4f2afc6\") " pod="openshift-infra/auto-csr-approver-29535924-jqt2s" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.399103 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdfnz\" (UniqueName: \"kubernetes.io/projected/5d4094f7-d1b4-4771-8d18-ea76f4f2afc6-kube-api-access-fdfnz\") pod \"auto-csr-approver-29535924-jqt2s\" (UID: \"5d4094f7-d1b4-4771-8d18-ea76f4f2afc6\") " 
pod="openshift-infra/auto-csr-approver-29535924-jqt2s" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.481241 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535924-jqt2s" Feb 27 01:24:01 crc kubenswrapper[4781]: I0227 01:24:01.334788 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:24:01 crc kubenswrapper[4781]: E0227 01:24:01.336465 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:24:01 crc kubenswrapper[4781]: I0227 01:24:01.334812 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" path="/var/lib/kubelet/pods/dff525c7-90db-4e5e-b13a-33b5dbfdb372/volumes" Feb 27 01:24:01 crc kubenswrapper[4781]: I0227 01:24:01.438770 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535924-jqt2s"] Feb 27 01:24:01 crc kubenswrapper[4781]: W0227 01:24:01.441979 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d4094f7_d1b4_4771_8d18_ea76f4f2afc6.slice/crio-a3993c2ada8a174086507fb13e67deb1f2d4005fa3eaa96d5e20fa8862acfd4c WatchSource:0}: Error finding container a3993c2ada8a174086507fb13e67deb1f2d4005fa3eaa96d5e20fa8862acfd4c: Status 404 returned error can't find the container with id a3993c2ada8a174086507fb13e67deb1f2d4005fa3eaa96d5e20fa8862acfd4c Feb 27 01:24:01 crc kubenswrapper[4781]: I0227 01:24:01.807798 4781 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-infra/auto-csr-approver-29535924-jqt2s" event={"ID":"5d4094f7-d1b4-4771-8d18-ea76f4f2afc6","Type":"ContainerStarted","Data":"a3993c2ada8a174086507fb13e67deb1f2d4005fa3eaa96d5e20fa8862acfd4c"} Feb 27 01:24:03 crc kubenswrapper[4781]: I0227 01:24:03.828424 4781 generic.go:334] "Generic (PLEG): container finished" podID="5d4094f7-d1b4-4771-8d18-ea76f4f2afc6" containerID="ddf9edbab834e6c1a55e9fec04cd3a8734214a1dea92a8471be43663e726da2d" exitCode=0 Feb 27 01:24:03 crc kubenswrapper[4781]: I0227 01:24:03.828483 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535924-jqt2s" event={"ID":"5d4094f7-d1b4-4771-8d18-ea76f4f2afc6","Type":"ContainerDied","Data":"ddf9edbab834e6c1a55e9fec04cd3a8734214a1dea92a8471be43663e726da2d"} Feb 27 01:24:05 crc kubenswrapper[4781]: I0227 01:24:05.372379 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535924-jqt2s" Feb 27 01:24:05 crc kubenswrapper[4781]: I0227 01:24:05.487569 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdfnz\" (UniqueName: \"kubernetes.io/projected/5d4094f7-d1b4-4771-8d18-ea76f4f2afc6-kube-api-access-fdfnz\") pod \"5d4094f7-d1b4-4771-8d18-ea76f4f2afc6\" (UID: \"5d4094f7-d1b4-4771-8d18-ea76f4f2afc6\") " Feb 27 01:24:05 crc kubenswrapper[4781]: I0227 01:24:05.496260 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4094f7-d1b4-4771-8d18-ea76f4f2afc6-kube-api-access-fdfnz" (OuterVolumeSpecName: "kube-api-access-fdfnz") pod "5d4094f7-d1b4-4771-8d18-ea76f4f2afc6" (UID: "5d4094f7-d1b4-4771-8d18-ea76f4f2afc6"). InnerVolumeSpecName "kube-api-access-fdfnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:05 crc kubenswrapper[4781]: I0227 01:24:05.590578 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdfnz\" (UniqueName: \"kubernetes.io/projected/5d4094f7-d1b4-4771-8d18-ea76f4f2afc6-kube-api-access-fdfnz\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:05 crc kubenswrapper[4781]: I0227 01:24:05.849131 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535924-jqt2s" event={"ID":"5d4094f7-d1b4-4771-8d18-ea76f4f2afc6","Type":"ContainerDied","Data":"a3993c2ada8a174086507fb13e67deb1f2d4005fa3eaa96d5e20fa8862acfd4c"} Feb 27 01:24:05 crc kubenswrapper[4781]: I0227 01:24:05.849189 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3993c2ada8a174086507fb13e67deb1f2d4005fa3eaa96d5e20fa8862acfd4c" Feb 27 01:24:05 crc kubenswrapper[4781]: I0227 01:24:05.849200 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535924-jqt2s" Feb 27 01:24:06 crc kubenswrapper[4781]: I0227 01:24:06.449478 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535918-hlgxs"] Feb 27 01:24:06 crc kubenswrapper[4781]: I0227 01:24:06.461753 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535918-hlgxs"] Feb 27 01:24:07 crc kubenswrapper[4781]: I0227 01:24:07.321832 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93462151-bfc8-4c6a-8d83-adc55e0b038c" path="/var/lib/kubelet/pods/93462151-bfc8-4c6a-8d83-adc55e0b038c/volumes" Feb 27 01:24:14 crc kubenswrapper[4781]: I0227 01:24:14.309853 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:24:14 crc kubenswrapper[4781]: E0227 01:24:14.310699 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.291223 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8jb27"] Feb 27 01:24:19 crc kubenswrapper[4781]: E0227 01:24:19.292157 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d4094f7-d1b4-4771-8d18-ea76f4f2afc6" containerName="oc" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.292170 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d4094f7-d1b4-4771-8d18-ea76f4f2afc6" containerName="oc" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.292365 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d4094f7-d1b4-4771-8d18-ea76f4f2afc6" containerName="oc" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.293849 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.324960 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8jb27"] Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.383819 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-catalog-content\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.384032 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmp5c\" (UniqueName: \"kubernetes.io/projected/c4ead436-ddbf-4703-971c-12f3b1a5673e-kube-api-access-zmp5c\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.384053 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-utilities\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.485777 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmp5c\" (UniqueName: \"kubernetes.io/projected/c4ead436-ddbf-4703-971c-12f3b1a5673e-kube-api-access-zmp5c\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.485836 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-utilities\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.485966 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-catalog-content\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.486368 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-utilities\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.486443 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-catalog-content\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.510402 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmp5c\" (UniqueName: \"kubernetes.io/projected/c4ead436-ddbf-4703-971c-12f3b1a5673e-kube-api-access-zmp5c\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.618588 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:20 crc kubenswrapper[4781]: I0227 01:24:20.266059 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8jb27"] Feb 27 01:24:20 crc kubenswrapper[4781]: I0227 01:24:20.993990 4781 generic.go:334] "Generic (PLEG): container finished" podID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerID="e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20" exitCode=0 Feb 27 01:24:20 crc kubenswrapper[4781]: I0227 01:24:20.994088 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jb27" event={"ID":"c4ead436-ddbf-4703-971c-12f3b1a5673e","Type":"ContainerDied","Data":"e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20"} Feb 27 01:24:20 crc kubenswrapper[4781]: I0227 01:24:20.994127 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jb27" event={"ID":"c4ead436-ddbf-4703-971c-12f3b1a5673e","Type":"ContainerStarted","Data":"5c47e43f064528bc2831a2b9ccb5e3b1b6a8041ff6b8a9408e4fb27c0e6a7ceb"} Feb 27 01:24:23 crc kubenswrapper[4781]: I0227 01:24:23.013881 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jb27" event={"ID":"c4ead436-ddbf-4703-971c-12f3b1a5673e","Type":"ContainerStarted","Data":"128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d"} Feb 27 01:24:25 crc kubenswrapper[4781]: I0227 01:24:25.056705 4781 generic.go:334] "Generic (PLEG): container finished" podID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerID="128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d" exitCode=0 Feb 27 01:24:25 crc kubenswrapper[4781]: I0227 01:24:25.057310 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jb27" 
event={"ID":"c4ead436-ddbf-4703-971c-12f3b1a5673e","Type":"ContainerDied","Data":"128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d"} Feb 27 01:24:26 crc kubenswrapper[4781]: I0227 01:24:26.069665 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jb27" event={"ID":"c4ead436-ddbf-4703-971c-12f3b1a5673e","Type":"ContainerStarted","Data":"15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db"} Feb 27 01:24:26 crc kubenswrapper[4781]: I0227 01:24:26.090925 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8jb27" podStartSLOduration=2.613274467 podStartE2EDuration="7.090902046s" podCreationTimestamp="2026-02-27 01:24:19 +0000 UTC" firstStartedPulling="2026-02-27 01:24:20.995835605 +0000 UTC m=+4730.253375159" lastFinishedPulling="2026-02-27 01:24:25.473463184 +0000 UTC m=+4734.731002738" observedRunningTime="2026-02-27 01:24:26.084737783 +0000 UTC m=+4735.342277337" watchObservedRunningTime="2026-02-27 01:24:26.090902046 +0000 UTC m=+4735.348441620" Feb 27 01:24:29 crc kubenswrapper[4781]: I0227 01:24:29.309271 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:24:29 crc kubenswrapper[4781]: E0227 01:24:29.309874 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:24:29 crc kubenswrapper[4781]: I0227 01:24:29.619442 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:29 crc 
kubenswrapper[4781]: I0227 01:24:29.619512 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:29 crc kubenswrapper[4781]: I0227 01:24:29.666602 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:30 crc kubenswrapper[4781]: I0227 01:24:30.150838 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:30 crc kubenswrapper[4781]: I0227 01:24:30.206133 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8jb27"] Feb 27 01:24:32 crc kubenswrapper[4781]: I0227 01:24:32.130711 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8jb27" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerName="registry-server" containerID="cri-o://15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db" gracePeriod=2 Feb 27 01:24:32 crc kubenswrapper[4781]: I0227 01:24:32.804085 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:32 crc kubenswrapper[4781]: I0227 01:24:32.996511 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmp5c\" (UniqueName: \"kubernetes.io/projected/c4ead436-ddbf-4703-971c-12f3b1a5673e-kube-api-access-zmp5c\") pod \"c4ead436-ddbf-4703-971c-12f3b1a5673e\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " Feb 27 01:24:32 crc kubenswrapper[4781]: I0227 01:24:32.996621 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-catalog-content\") pod \"c4ead436-ddbf-4703-971c-12f3b1a5673e\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " Feb 27 01:24:32 crc kubenswrapper[4781]: I0227 01:24:32.996746 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-utilities\") pod \"c4ead436-ddbf-4703-971c-12f3b1a5673e\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " Feb 27 01:24:32 crc kubenswrapper[4781]: I0227 01:24:32.997663 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-utilities" (OuterVolumeSpecName: "utilities") pod "c4ead436-ddbf-4703-971c-12f3b1a5673e" (UID: "c4ead436-ddbf-4703-971c-12f3b1a5673e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.002585 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ead436-ddbf-4703-971c-12f3b1a5673e-kube-api-access-zmp5c" (OuterVolumeSpecName: "kube-api-access-zmp5c") pod "c4ead436-ddbf-4703-971c-12f3b1a5673e" (UID: "c4ead436-ddbf-4703-971c-12f3b1a5673e"). InnerVolumeSpecName "kube-api-access-zmp5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.099060 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.099118 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmp5c\" (UniqueName: \"kubernetes.io/projected/c4ead436-ddbf-4703-971c-12f3b1a5673e-kube-api-access-zmp5c\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.144430 4781 generic.go:334] "Generic (PLEG): container finished" podID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerID="15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db" exitCode=0 Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.144539 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jb27" event={"ID":"c4ead436-ddbf-4703-971c-12f3b1a5673e","Type":"ContainerDied","Data":"15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db"} Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.145761 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jb27" event={"ID":"c4ead436-ddbf-4703-971c-12f3b1a5673e","Type":"ContainerDied","Data":"5c47e43f064528bc2831a2b9ccb5e3b1b6a8041ff6b8a9408e4fb27c0e6a7ceb"} Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.144556 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.145832 4781 scope.go:117] "RemoveContainer" containerID="15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.167419 4781 scope.go:117] "RemoveContainer" containerID="128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.195525 4781 scope.go:117] "RemoveContainer" containerID="e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.246854 4781 scope.go:117] "RemoveContainer" containerID="15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db" Feb 27 01:24:33 crc kubenswrapper[4781]: E0227 01:24:33.247328 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db\": container with ID starting with 15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db not found: ID does not exist" containerID="15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.247362 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db"} err="failed to get container status \"15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db\": rpc error: code = NotFound desc = could not find container \"15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db\": container with ID starting with 15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db not found: ID does not exist" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.247385 4781 scope.go:117] "RemoveContainer" 
containerID="128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d" Feb 27 01:24:33 crc kubenswrapper[4781]: E0227 01:24:33.247696 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d\": container with ID starting with 128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d not found: ID does not exist" containerID="128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.247730 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d"} err="failed to get container status \"128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d\": rpc error: code = NotFound desc = could not find container \"128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d\": container with ID starting with 128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d not found: ID does not exist" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.247748 4781 scope.go:117] "RemoveContainer" containerID="e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20" Feb 27 01:24:33 crc kubenswrapper[4781]: E0227 01:24:33.247976 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20\": container with ID starting with e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20 not found: ID does not exist" containerID="e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.248006 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20"} err="failed to get container status \"e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20\": rpc error: code = NotFound desc = could not find container \"e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20\": container with ID starting with e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20 not found: ID does not exist" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.294450 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4ead436-ddbf-4703-971c-12f3b1a5673e" (UID: "c4ead436-ddbf-4703-971c-12f3b1a5673e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.302581 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.481364 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8jb27"] Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.492240 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8jb27"] Feb 27 01:24:34 crc kubenswrapper[4781]: I0227 01:24:34.035998 4781 scope.go:117] "RemoveContainer" containerID="500185c8a41f1ea03fad4eed8ceeb62b2a655600fefd254d6835b485744f3e8b" Feb 27 01:24:35 crc kubenswrapper[4781]: I0227 01:24:35.319901 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" path="/var/lib/kubelet/pods/c4ead436-ddbf-4703-971c-12f3b1a5673e/volumes" Feb 27 01:24:43 crc kubenswrapper[4781]: I0227 
01:24:43.310435 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:24:43 crc kubenswrapper[4781]: E0227 01:24:43.311303 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:24:58 crc kubenswrapper[4781]: I0227 01:24:58.309378 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:24:58 crc kubenswrapper[4781]: E0227 01:24:58.310279 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:25:12 crc kubenswrapper[4781]: I0227 01:25:12.309202 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:25:12 crc kubenswrapper[4781]: E0227 01:25:12.310074 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:25:27 crc 
kubenswrapper[4781]: I0227 01:25:27.310771 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:25:27 crc kubenswrapper[4781]: E0227 01:25:27.311988 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:25:38 crc kubenswrapper[4781]: I0227 01:25:38.309678 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:25:38 crc kubenswrapper[4781]: E0227 01:25:38.310495 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:25:52 crc kubenswrapper[4781]: I0227 01:25:52.309951 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:25:52 crc kubenswrapper[4781]: E0227 01:25:52.310796 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 
27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.149300 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535926-dk95r"] Feb 27 01:26:00 crc kubenswrapper[4781]: E0227 01:26:00.150483 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerName="extract-content" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.150532 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerName="extract-content" Feb 27 01:26:00 crc kubenswrapper[4781]: E0227 01:26:00.150557 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerName="extract-utilities" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.150564 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerName="extract-utilities" Feb 27 01:26:00 crc kubenswrapper[4781]: E0227 01:26:00.150580 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerName="registry-server" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.150591 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerName="registry-server" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.150886 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerName="registry-server" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.151809 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535926-dk95r" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.154132 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.154395 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.155306 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.159984 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535926-dk95r"] Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.336929 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bv88\" (UniqueName: \"kubernetes.io/projected/4b2699a8-ed99-4729-8537-c56d4e3020a5-kube-api-access-8bv88\") pod \"auto-csr-approver-29535926-dk95r\" (UID: \"4b2699a8-ed99-4729-8537-c56d4e3020a5\") " pod="openshift-infra/auto-csr-approver-29535926-dk95r" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.438985 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bv88\" (UniqueName: \"kubernetes.io/projected/4b2699a8-ed99-4729-8537-c56d4e3020a5-kube-api-access-8bv88\") pod \"auto-csr-approver-29535926-dk95r\" (UID: \"4b2699a8-ed99-4729-8537-c56d4e3020a5\") " pod="openshift-infra/auto-csr-approver-29535926-dk95r" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.463430 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bv88\" (UniqueName: \"kubernetes.io/projected/4b2699a8-ed99-4729-8537-c56d4e3020a5-kube-api-access-8bv88\") pod \"auto-csr-approver-29535926-dk95r\" (UID: \"4b2699a8-ed99-4729-8537-c56d4e3020a5\") " 
pod="openshift-infra/auto-csr-approver-29535926-dk95r" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.473840 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535926-dk95r" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.960663 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535926-dk95r"] Feb 27 01:26:01 crc kubenswrapper[4781]: I0227 01:26:01.008910 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535926-dk95r" event={"ID":"4b2699a8-ed99-4729-8537-c56d4e3020a5","Type":"ContainerStarted","Data":"5b83dca4856c933ef6979aa2b51369b348695367cf72417555ffefd24a1d1e69"} Feb 27 01:26:03 crc kubenswrapper[4781]: I0227 01:26:03.030104 4781 generic.go:334] "Generic (PLEG): container finished" podID="4b2699a8-ed99-4729-8537-c56d4e3020a5" containerID="16a9f9d6ea8b6379b17f302340277b5de7121ea70fcd8873f81764601e863272" exitCode=0 Feb 27 01:26:03 crc kubenswrapper[4781]: I0227 01:26:03.030169 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535926-dk95r" event={"ID":"4b2699a8-ed99-4729-8537-c56d4e3020a5","Type":"ContainerDied","Data":"16a9f9d6ea8b6379b17f302340277b5de7121ea70fcd8873f81764601e863272"} Feb 27 01:26:04 crc kubenswrapper[4781]: I0227 01:26:04.581947 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535926-dk95r" Feb 27 01:26:04 crc kubenswrapper[4781]: I0227 01:26:04.733247 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bv88\" (UniqueName: \"kubernetes.io/projected/4b2699a8-ed99-4729-8537-c56d4e3020a5-kube-api-access-8bv88\") pod \"4b2699a8-ed99-4729-8537-c56d4e3020a5\" (UID: \"4b2699a8-ed99-4729-8537-c56d4e3020a5\") " Feb 27 01:26:04 crc kubenswrapper[4781]: I0227 01:26:04.750780 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b2699a8-ed99-4729-8537-c56d4e3020a5-kube-api-access-8bv88" (OuterVolumeSpecName: "kube-api-access-8bv88") pod "4b2699a8-ed99-4729-8537-c56d4e3020a5" (UID: "4b2699a8-ed99-4729-8537-c56d4e3020a5"). InnerVolumeSpecName "kube-api-access-8bv88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:26:04 crc kubenswrapper[4781]: I0227 01:26:04.835569 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bv88\" (UniqueName: \"kubernetes.io/projected/4b2699a8-ed99-4729-8537-c56d4e3020a5-kube-api-access-8bv88\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:05 crc kubenswrapper[4781]: I0227 01:26:05.050089 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535926-dk95r" event={"ID":"4b2699a8-ed99-4729-8537-c56d4e3020a5","Type":"ContainerDied","Data":"5b83dca4856c933ef6979aa2b51369b348695367cf72417555ffefd24a1d1e69"} Feb 27 01:26:05 crc kubenswrapper[4781]: I0227 01:26:05.050124 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535926-dk95r" Feb 27 01:26:05 crc kubenswrapper[4781]: I0227 01:26:05.050127 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b83dca4856c933ef6979aa2b51369b348695367cf72417555ffefd24a1d1e69" Feb 27 01:26:05 crc kubenswrapper[4781]: I0227 01:26:05.651235 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535920-hp92r"] Feb 27 01:26:05 crc kubenswrapper[4781]: I0227 01:26:05.661020 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535920-hp92r"] Feb 27 01:26:07 crc kubenswrapper[4781]: I0227 01:26:07.310025 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:26:07 crc kubenswrapper[4781]: E0227 01:26:07.310364 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:26:07 crc kubenswrapper[4781]: I0227 01:26:07.321513 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="960f5179-f532-4fbf-90fa-e19414cbe684" path="/var/lib/kubelet/pods/960f5179-f532-4fbf-90fa-e19414cbe684/volumes" Feb 27 01:26:18 crc kubenswrapper[4781]: I0227 01:26:18.310388 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:26:18 crc kubenswrapper[4781]: E0227 01:26:18.311348 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:26:31 crc kubenswrapper[4781]: I0227 01:26:31.318759 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:26:31 crc kubenswrapper[4781]: E0227 01:26:31.319896 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:26:34 crc kubenswrapper[4781]: I0227 01:26:34.377882 4781 scope.go:117] "RemoveContainer" containerID="7e0241ade9afef50720d7328e1e27817a80d46d9126df5c01d0b6695a2c96b4c" Feb 27 01:26:45 crc kubenswrapper[4781]: I0227 01:26:45.309574 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:26:45 crc kubenswrapper[4781]: E0227 01:26:45.310429 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:27:00 crc kubenswrapper[4781]: I0227 01:27:00.309062 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:27:00 crc kubenswrapper[4781]: 
E0227 01:27:00.309856 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:27:15 crc kubenswrapper[4781]: I0227 01:27:15.310281 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:27:15 crc kubenswrapper[4781]: E0227 01:27:15.311408 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:27:27 crc kubenswrapper[4781]: I0227 01:27:27.309393 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:27:27 crc kubenswrapper[4781]: E0227 01:27:27.310163 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:27:40 crc kubenswrapper[4781]: I0227 01:27:40.309371 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:27:40 crc 
kubenswrapper[4781]: E0227 01:27:40.310246 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:27:54 crc kubenswrapper[4781]: I0227 01:27:54.309364 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:27:55 crc kubenswrapper[4781]: I0227 01:27:55.114912 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"3ca68bdb706f059286e9cab162cb1e9da3e558b32a0ae147b05ecbdd73deb984"} Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.144471 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535928-h2h9s"] Feb 27 01:28:00 crc kubenswrapper[4781]: E0227 01:28:00.145442 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2699a8-ed99-4729-8537-c56d4e3020a5" containerName="oc" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.145456 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2699a8-ed99-4729-8537-c56d4e3020a5" containerName="oc" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.145719 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b2699a8-ed99-4729-8537-c56d4e3020a5" containerName="oc" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.146533 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535928-h2h9s" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.148714 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.149118 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.149029 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.156018 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535928-h2h9s"] Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.222544 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdbn4\" (UniqueName: \"kubernetes.io/projected/7afe05a2-70dc-4ef0-93a1-af0fee2fd536-kube-api-access-pdbn4\") pod \"auto-csr-approver-29535928-h2h9s\" (UID: \"7afe05a2-70dc-4ef0-93a1-af0fee2fd536\") " pod="openshift-infra/auto-csr-approver-29535928-h2h9s" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.323741 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdbn4\" (UniqueName: \"kubernetes.io/projected/7afe05a2-70dc-4ef0-93a1-af0fee2fd536-kube-api-access-pdbn4\") pod \"auto-csr-approver-29535928-h2h9s\" (UID: \"7afe05a2-70dc-4ef0-93a1-af0fee2fd536\") " pod="openshift-infra/auto-csr-approver-29535928-h2h9s" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.343095 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdbn4\" (UniqueName: \"kubernetes.io/projected/7afe05a2-70dc-4ef0-93a1-af0fee2fd536-kube-api-access-pdbn4\") pod \"auto-csr-approver-29535928-h2h9s\" (UID: \"7afe05a2-70dc-4ef0-93a1-af0fee2fd536\") " 
pod="openshift-infra/auto-csr-approver-29535928-h2h9s" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.471078 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535928-h2h9s" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.952764 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535928-h2h9s"] Feb 27 01:28:01 crc kubenswrapper[4781]: I0227 01:28:01.169009 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535928-h2h9s" event={"ID":"7afe05a2-70dc-4ef0-93a1-af0fee2fd536","Type":"ContainerStarted","Data":"e288234e3a2fb7071d5906564e8fd676f6e72bef4cba6fce7f10e2a518fa17f6"} Feb 27 01:28:03 crc kubenswrapper[4781]: I0227 01:28:03.198470 4781 generic.go:334] "Generic (PLEG): container finished" podID="7afe05a2-70dc-4ef0-93a1-af0fee2fd536" containerID="b8699f3e4280117c3b1dd81cc2aea325b5c92d940a277b35186feed5736aed65" exitCode=0 Feb 27 01:28:03 crc kubenswrapper[4781]: I0227 01:28:03.198940 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535928-h2h9s" event={"ID":"7afe05a2-70dc-4ef0-93a1-af0fee2fd536","Type":"ContainerDied","Data":"b8699f3e4280117c3b1dd81cc2aea325b5c92d940a277b35186feed5736aed65"} Feb 27 01:28:04 crc kubenswrapper[4781]: I0227 01:28:04.804923 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535928-h2h9s" Feb 27 01:28:04 crc kubenswrapper[4781]: I0227 01:28:04.826858 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdbn4\" (UniqueName: \"kubernetes.io/projected/7afe05a2-70dc-4ef0-93a1-af0fee2fd536-kube-api-access-pdbn4\") pod \"7afe05a2-70dc-4ef0-93a1-af0fee2fd536\" (UID: \"7afe05a2-70dc-4ef0-93a1-af0fee2fd536\") " Feb 27 01:28:04 crc kubenswrapper[4781]: I0227 01:28:04.833363 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7afe05a2-70dc-4ef0-93a1-af0fee2fd536-kube-api-access-pdbn4" (OuterVolumeSpecName: "kube-api-access-pdbn4") pod "7afe05a2-70dc-4ef0-93a1-af0fee2fd536" (UID: "7afe05a2-70dc-4ef0-93a1-af0fee2fd536"). InnerVolumeSpecName "kube-api-access-pdbn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:28:04 crc kubenswrapper[4781]: I0227 01:28:04.929499 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdbn4\" (UniqueName: \"kubernetes.io/projected/7afe05a2-70dc-4ef0-93a1-af0fee2fd536-kube-api-access-pdbn4\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:05 crc kubenswrapper[4781]: I0227 01:28:05.224695 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535928-h2h9s" event={"ID":"7afe05a2-70dc-4ef0-93a1-af0fee2fd536","Type":"ContainerDied","Data":"e288234e3a2fb7071d5906564e8fd676f6e72bef4cba6fce7f10e2a518fa17f6"} Feb 27 01:28:05 crc kubenswrapper[4781]: I0227 01:28:05.225165 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e288234e3a2fb7071d5906564e8fd676f6e72bef4cba6fce7f10e2a518fa17f6" Feb 27 01:28:05 crc kubenswrapper[4781]: I0227 01:28:05.224750 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535928-h2h9s" Feb 27 01:28:05 crc kubenswrapper[4781]: I0227 01:28:05.871725 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535922-v2mrc"] Feb 27 01:28:05 crc kubenswrapper[4781]: I0227 01:28:05.882319 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535922-v2mrc"] Feb 27 01:28:07 crc kubenswrapper[4781]: I0227 01:28:07.329201 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4de45e6-34d0-42f2-a5ef-3db90864a559" path="/var/lib/kubelet/pods/a4de45e6-34d0-42f2-a5ef-3db90864a559/volumes" Feb 27 01:28:34 crc kubenswrapper[4781]: I0227 01:28:34.476799 4781 scope.go:117] "RemoveContainer" containerID="2f2dfc7d2070f93d2a77b14a366a4665e36d3d475590bb3af0ca699d7f8bebd2" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.167963 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535930-n6tx4"] Feb 27 01:30:00 crc kubenswrapper[4781]: E0227 01:30:00.169269 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7afe05a2-70dc-4ef0-93a1-af0fee2fd536" containerName="oc" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.169290 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7afe05a2-70dc-4ef0-93a1-af0fee2fd536" containerName="oc" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.169569 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7afe05a2-70dc-4ef0-93a1-af0fee2fd536" containerName="oc" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.170687 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535930-n6tx4" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.175391 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.175706 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.175724 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.187110 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7"] Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.188820 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.191807 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.192206 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.202120 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535930-n6tx4"] Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.216242 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7"] Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.254720 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxspz\" (UniqueName: 
\"kubernetes.io/projected/e06cf708-30fa-49dc-adab-a1f2c7990710-kube-api-access-hxspz\") pod \"auto-csr-approver-29535930-n6tx4\" (UID: \"e06cf708-30fa-49dc-adab-a1f2c7990710\") " pod="openshift-infra/auto-csr-approver-29535930-n6tx4" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.254989 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86366445-86a7-4d4e-9227-3f30e17f16a8-secret-volume\") pod \"collect-profiles-29535930-9q6h7\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.255054 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hlsl\" (UniqueName: \"kubernetes.io/projected/86366445-86a7-4d4e-9227-3f30e17f16a8-kube-api-access-7hlsl\") pod \"collect-profiles-29535930-9q6h7\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.255345 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86366445-86a7-4d4e-9227-3f30e17f16a8-config-volume\") pod \"collect-profiles-29535930-9q6h7\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.357467 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86366445-86a7-4d4e-9227-3f30e17f16a8-secret-volume\") pod \"collect-profiles-29535930-9q6h7\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 
crc kubenswrapper[4781]: I0227 01:30:00.357518 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hlsl\" (UniqueName: \"kubernetes.io/projected/86366445-86a7-4d4e-9227-3f30e17f16a8-kube-api-access-7hlsl\") pod \"collect-profiles-29535930-9q6h7\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.357568 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86366445-86a7-4d4e-9227-3f30e17f16a8-config-volume\") pod \"collect-profiles-29535930-9q6h7\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.357921 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxspz\" (UniqueName: \"kubernetes.io/projected/e06cf708-30fa-49dc-adab-a1f2c7990710-kube-api-access-hxspz\") pod \"auto-csr-approver-29535930-n6tx4\" (UID: \"e06cf708-30fa-49dc-adab-a1f2c7990710\") " pod="openshift-infra/auto-csr-approver-29535930-n6tx4" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.361536 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86366445-86a7-4d4e-9227-3f30e17f16a8-config-volume\") pod \"collect-profiles-29535930-9q6h7\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.366603 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86366445-86a7-4d4e-9227-3f30e17f16a8-secret-volume\") pod \"collect-profiles-29535930-9q6h7\" (UID: 
\"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.381468 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxspz\" (UniqueName: \"kubernetes.io/projected/e06cf708-30fa-49dc-adab-a1f2c7990710-kube-api-access-hxspz\") pod \"auto-csr-approver-29535930-n6tx4\" (UID: \"e06cf708-30fa-49dc-adab-a1f2c7990710\") " pod="openshift-infra/auto-csr-approver-29535930-n6tx4" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.382337 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hlsl\" (UniqueName: \"kubernetes.io/projected/86366445-86a7-4d4e-9227-3f30e17f16a8-kube-api-access-7hlsl\") pod \"collect-profiles-29535930-9q6h7\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.498485 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535930-n6tx4" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.518182 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:01 crc kubenswrapper[4781]: I0227 01:30:01.157044 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7"] Feb 27 01:30:01 crc kubenswrapper[4781]: I0227 01:30:01.530514 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" event={"ID":"86366445-86a7-4d4e-9227-3f30e17f16a8","Type":"ContainerStarted","Data":"4852cc38ee40f54d89db92a7646463e89e6392f011efd9bad64c5c8b6ff1b3eb"} Feb 27 01:30:01 crc kubenswrapper[4781]: I0227 01:30:01.675519 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535930-n6tx4"] Feb 27 01:30:01 crc kubenswrapper[4781]: W0227 01:30:01.682803 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode06cf708_30fa_49dc_adab_a1f2c7990710.slice/crio-15fe8717e4c838968b868e49e039958b88e62afd1f8d3bcd1ab02b91aa6a520e WatchSource:0}: Error finding container 15fe8717e4c838968b868e49e039958b88e62afd1f8d3bcd1ab02b91aa6a520e: Status 404 returned error can't find the container with id 15fe8717e4c838968b868e49e039958b88e62afd1f8d3bcd1ab02b91aa6a520e Feb 27 01:30:01 crc kubenswrapper[4781]: I0227 01:30:01.688959 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:30:02 crc kubenswrapper[4781]: I0227 01:30:02.548952 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535930-n6tx4" event={"ID":"e06cf708-30fa-49dc-adab-a1f2c7990710","Type":"ContainerStarted","Data":"15fe8717e4c838968b868e49e039958b88e62afd1f8d3bcd1ab02b91aa6a520e"} Feb 27 01:30:02 crc kubenswrapper[4781]: I0227 01:30:02.555889 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="86366445-86a7-4d4e-9227-3f30e17f16a8" containerID="4911eafe22c06c1a3e360ba662d163448b23902834655fe9066930069a16df01" exitCode=0 Feb 27 01:30:02 crc kubenswrapper[4781]: I0227 01:30:02.555956 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" event={"ID":"86366445-86a7-4d4e-9227-3f30e17f16a8","Type":"ContainerDied","Data":"4911eafe22c06c1a3e360ba662d163448b23902834655fe9066930069a16df01"} Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.204212 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.271970 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86366445-86a7-4d4e-9227-3f30e17f16a8-secret-volume\") pod \"86366445-86a7-4d4e-9227-3f30e17f16a8\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.272129 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hlsl\" (UniqueName: \"kubernetes.io/projected/86366445-86a7-4d4e-9227-3f30e17f16a8-kube-api-access-7hlsl\") pod \"86366445-86a7-4d4e-9227-3f30e17f16a8\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.272156 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86366445-86a7-4d4e-9227-3f30e17f16a8-config-volume\") pod \"86366445-86a7-4d4e-9227-3f30e17f16a8\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.273701 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86366445-86a7-4d4e-9227-3f30e17f16a8-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "86366445-86a7-4d4e-9227-3f30e17f16a8" (UID: "86366445-86a7-4d4e-9227-3f30e17f16a8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.280556 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86366445-86a7-4d4e-9227-3f30e17f16a8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "86366445-86a7-4d4e-9227-3f30e17f16a8" (UID: "86366445-86a7-4d4e-9227-3f30e17f16a8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.281034 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86366445-86a7-4d4e-9227-3f30e17f16a8-kube-api-access-7hlsl" (OuterVolumeSpecName: "kube-api-access-7hlsl") pod "86366445-86a7-4d4e-9227-3f30e17f16a8" (UID: "86366445-86a7-4d4e-9227-3f30e17f16a8"). InnerVolumeSpecName "kube-api-access-7hlsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.375009 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86366445-86a7-4d4e-9227-3f30e17f16a8-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.375051 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hlsl\" (UniqueName: \"kubernetes.io/projected/86366445-86a7-4d4e-9227-3f30e17f16a8-kube-api-access-7hlsl\") on node \"crc\" DevicePath \"\"" Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.375063 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86366445-86a7-4d4e-9227-3f30e17f16a8-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.583714 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" event={"ID":"86366445-86a7-4d4e-9227-3f30e17f16a8","Type":"ContainerDied","Data":"4852cc38ee40f54d89db92a7646463e89e6392f011efd9bad64c5c8b6ff1b3eb"} Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.584074 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4852cc38ee40f54d89db92a7646463e89e6392f011efd9bad64c5c8b6ff1b3eb" Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.584140 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:05 crc kubenswrapper[4781]: I0227 01:30:05.299152 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2"] Feb 27 01:30:05 crc kubenswrapper[4781]: I0227 01:30:05.324321 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2"] Feb 27 01:30:05 crc kubenswrapper[4781]: I0227 01:30:05.600127 4781 generic.go:334] "Generic (PLEG): container finished" podID="e06cf708-30fa-49dc-adab-a1f2c7990710" containerID="e1d561aa4537e2a7029bf8a15452492b8da6a5c737b211fb44b195f7580f906b" exitCode=0 Feb 27 01:30:05 crc kubenswrapper[4781]: I0227 01:30:05.600195 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535930-n6tx4" event={"ID":"e06cf708-30fa-49dc-adab-a1f2c7990710","Type":"ContainerDied","Data":"e1d561aa4537e2a7029bf8a15452492b8da6a5c737b211fb44b195f7580f906b"} Feb 27 01:30:07 crc kubenswrapper[4781]: I0227 01:30:07.300644 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535930-n6tx4" Feb 27 01:30:07 crc kubenswrapper[4781]: I0227 01:30:07.321668 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db" path="/var/lib/kubelet/pods/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db/volumes" Feb 27 01:30:07 crc kubenswrapper[4781]: I0227 01:30:07.349176 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxspz\" (UniqueName: \"kubernetes.io/projected/e06cf708-30fa-49dc-adab-a1f2c7990710-kube-api-access-hxspz\") pod \"e06cf708-30fa-49dc-adab-a1f2c7990710\" (UID: \"e06cf708-30fa-49dc-adab-a1f2c7990710\") " Feb 27 01:30:07 crc kubenswrapper[4781]: I0227 01:30:07.359240 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e06cf708-30fa-49dc-adab-a1f2c7990710-kube-api-access-hxspz" (OuterVolumeSpecName: "kube-api-access-hxspz") pod "e06cf708-30fa-49dc-adab-a1f2c7990710" (UID: "e06cf708-30fa-49dc-adab-a1f2c7990710"). InnerVolumeSpecName "kube-api-access-hxspz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:30:07 crc kubenswrapper[4781]: I0227 01:30:07.452569 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxspz\" (UniqueName: \"kubernetes.io/projected/e06cf708-30fa-49dc-adab-a1f2c7990710-kube-api-access-hxspz\") on node \"crc\" DevicePath \"\"" Feb 27 01:30:07 crc kubenswrapper[4781]: I0227 01:30:07.625434 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535930-n6tx4" event={"ID":"e06cf708-30fa-49dc-adab-a1f2c7990710","Type":"ContainerDied","Data":"15fe8717e4c838968b868e49e039958b88e62afd1f8d3bcd1ab02b91aa6a520e"} Feb 27 01:30:07 crc kubenswrapper[4781]: I0227 01:30:07.625539 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15fe8717e4c838968b868e49e039958b88e62afd1f8d3bcd1ab02b91aa6a520e" Feb 27 01:30:07 crc kubenswrapper[4781]: I0227 01:30:07.625544 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535930-n6tx4" Feb 27 01:30:08 crc kubenswrapper[4781]: I0227 01:30:08.369941 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535924-jqt2s"] Feb 27 01:30:08 crc kubenswrapper[4781]: I0227 01:30:08.381750 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535924-jqt2s"] Feb 27 01:30:09 crc kubenswrapper[4781]: I0227 01:30:09.323253 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d4094f7-d1b4-4771-8d18-ea76f4f2afc6" path="/var/lib/kubelet/pods/5d4094f7-d1b4-4771-8d18-ea76f4f2afc6/volumes" Feb 27 01:30:12 crc kubenswrapper[4781]: I0227 01:30:12.895302 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 27 01:30:12 crc kubenswrapper[4781]: I0227 01:30:12.895758 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:30:34 crc kubenswrapper[4781]: I0227 01:30:34.584552 4781 scope.go:117] "RemoveContainer" containerID="ddf9edbab834e6c1a55e9fec04cd3a8734214a1dea92a8471be43663e726da2d" Feb 27 01:30:34 crc kubenswrapper[4781]: I0227 01:30:34.643965 4781 scope.go:117] "RemoveContainer" containerID="13bcf8d94b2a16937b07dfe8f4ce503b88a240b7d9c23876edfc03e06b4dceeb" Feb 27 01:30:42 crc kubenswrapper[4781]: I0227 01:30:42.895437 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:30:42 crc kubenswrapper[4781]: I0227 01:30:42.895989 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"